Dec 07 19:14:54 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 07 19:14:54 crc restorecon[4639]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 07 19:14:54 crc restorecon[4639]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc 
restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc 
restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 
19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 
crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:54 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 07 19:14:55 crc restorecon[4639]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 
crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc 
restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 07 19:14:55 crc restorecon[4639]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc 
restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 07 19:14:55 crc restorecon[4639]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 07 19:14:55 crc kubenswrapper[4815]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.582614 4815 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586211 4815 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586230 4815 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586235 4815 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586239 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586244 4815 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586248 4815 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586252 4815 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586256 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586260 4815 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 
19:14:55.586265 4815 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586269 4815 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586273 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586278 4815 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586291 4815 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586296 4815 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586300 4815 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586303 4815 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586307 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586311 4815 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586315 4815 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586318 4815 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586322 4815 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586325 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586329 4815 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 07 19:14:55 crc 
kubenswrapper[4815]: W1207 19:14:55.586333 4815 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586336 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586340 4815 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586344 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586347 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586353 4815 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586358 4815 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586362 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586365 4815 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586369 4815 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586372 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586376 4815 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586379 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586383 4815 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 07 
19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586387 4815 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586391 4815 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586394 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586398 4815 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586401 4815 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586405 4815 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586409 4815 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586412 4815 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586416 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586419 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586423 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586427 4815 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586430 4815 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586433 4815 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586437 4815 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586440 4815 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586443 4815 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586447 4815 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586450 4815 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586455 4815 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586459 4815 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586464 4815 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586468 4815 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586472 4815 feature_gate.go:330] unrecognized feature gate: Example Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586476 4815 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586479 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586484 4815 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586489 4815 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586493 4815 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586497 4815 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586500 4815 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586505 4815 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.586509 4815 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586744 4815 flags.go:64] FLAG: --address="0.0.0.0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586753 4815 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586761 4815 flags.go:64] FLAG: --anonymous-auth="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586767 4815 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586773 4815 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586777 4815 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586783 4815 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586789 4815 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586793 4815 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586797 4815 
flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586802 4815 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586806 4815 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586810 4815 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586814 4815 flags.go:64] FLAG: --cgroup-root="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586818 4815 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586823 4815 flags.go:64] FLAG: --client-ca-file="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586827 4815 flags.go:64] FLAG: --cloud-config="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586831 4815 flags.go:64] FLAG: --cloud-provider="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586835 4815 flags.go:64] FLAG: --cluster-dns="[]" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586839 4815 flags.go:64] FLAG: --cluster-domain="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586843 4815 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586847 4815 flags.go:64] FLAG: --config-dir="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586851 4815 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586855 4815 flags.go:64] FLAG: --container-log-max-files="5" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586860 4815 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586865 4815 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 
19:14:55.586869 4815 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586873 4815 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586877 4815 flags.go:64] FLAG: --contention-profiling="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586881 4815 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586886 4815 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586890 4815 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586894 4815 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586899 4815 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586903 4815 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586924 4815 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586930 4815 flags.go:64] FLAG: --enable-load-reader="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586934 4815 flags.go:64] FLAG: --enable-server="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586938 4815 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586944 4815 flags.go:64] FLAG: --event-burst="100" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586948 4815 flags.go:64] FLAG: --event-qps="50" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586952 4815 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586957 4815 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 07 19:14:55 crc 
kubenswrapper[4815]: I1207 19:14:55.586960 4815 flags.go:64] FLAG: --eviction-hard="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586965 4815 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586969 4815 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586974 4815 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586978 4815 flags.go:64] FLAG: --eviction-soft="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586982 4815 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586987 4815 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586992 4815 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.586996 4815 flags.go:64] FLAG: --experimental-mounter-path="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587001 4815 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587005 4815 flags.go:64] FLAG: --fail-swap-on="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587009 4815 flags.go:64] FLAG: --feature-gates="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587014 4815 flags.go:64] FLAG: --file-check-frequency="20s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587042 4815 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587047 4815 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587051 4815 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587056 4815 flags.go:64] FLAG: --healthz-port="10248" Dec 07 19:14:55 
crc kubenswrapper[4815]: I1207 19:14:55.587060 4815 flags.go:64] FLAG: --help="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587065 4815 flags.go:64] FLAG: --hostname-override="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587069 4815 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587073 4815 flags.go:64] FLAG: --http-check-frequency="20s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587077 4815 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587082 4815 flags.go:64] FLAG: --image-credential-provider-config="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587087 4815 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587093 4815 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587097 4815 flags.go:64] FLAG: --image-service-endpoint="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587102 4815 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587106 4815 flags.go:64] FLAG: --kube-api-burst="100" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587110 4815 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587114 4815 flags.go:64] FLAG: --kube-api-qps="50" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587118 4815 flags.go:64] FLAG: --kube-reserved="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587122 4815 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587126 4815 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587130 4815 flags.go:64] FLAG: --kubelet-cgroups="" Dec 07 19:14:55 crc 
kubenswrapper[4815]: I1207 19:14:55.587134 4815 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587138 4815 flags.go:64] FLAG: --lock-file="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587142 4815 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587146 4815 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587150 4815 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587157 4815 flags.go:64] FLAG: --log-json-split-stream="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587161 4815 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587165 4815 flags.go:64] FLAG: --log-text-split-stream="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587169 4815 flags.go:64] FLAG: --logging-format="text" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587174 4815 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587178 4815 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587182 4815 flags.go:64] FLAG: --manifest-url="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587186 4815 flags.go:64] FLAG: --manifest-url-header="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587192 4815 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587196 4815 flags.go:64] FLAG: --max-open-files="1000000" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587201 4815 flags.go:64] FLAG: --max-pods="110" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587205 4815 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 07 19:14:55 crc 
kubenswrapper[4815]: I1207 19:14:55.587209 4815 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587213 4815 flags.go:64] FLAG: --memory-manager-policy="None" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587217 4815 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587221 4815 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587225 4815 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587231 4815 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587243 4815 flags.go:64] FLAG: --node-status-max-images="50" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587247 4815 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587259 4815 flags.go:64] FLAG: --oom-score-adj="-999" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587263 4815 flags.go:64] FLAG: --pod-cidr="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587267 4815 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587273 4815 flags.go:64] FLAG: --pod-manifest-path="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587277 4815 flags.go:64] FLAG: --pod-max-pids="-1" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587282 4815 flags.go:64] FLAG: --pods-per-core="0" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587286 4815 flags.go:64] FLAG: --port="10250" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587290 4815 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587294 4815 flags.go:64] FLAG: --provider-id="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587298 4815 flags.go:64] FLAG: --qos-reserved="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587301 4815 flags.go:64] FLAG: --read-only-port="10255" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587306 4815 flags.go:64] FLAG: --register-node="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587309 4815 flags.go:64] FLAG: --register-schedulable="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587313 4815 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587324 4815 flags.go:64] FLAG: --registry-burst="10" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587328 4815 flags.go:64] FLAG: --registry-qps="5" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587332 4815 flags.go:64] FLAG: --reserved-cpus="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587336 4815 flags.go:64] FLAG: --reserved-memory="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587341 4815 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587345 4815 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587349 4815 flags.go:64] FLAG: --rotate-certificates="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587354 4815 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587358 4815 flags.go:64] FLAG: --runonce="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587362 4815 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587366 4815 flags.go:64] FLAG: 
--runtime-request-timeout="2m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587370 4815 flags.go:64] FLAG: --seccomp-default="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587374 4815 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587378 4815 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587382 4815 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587386 4815 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587391 4815 flags.go:64] FLAG: --storage-driver-password="root" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587395 4815 flags.go:64] FLAG: --storage-driver-secure="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587400 4815 flags.go:64] FLAG: --storage-driver-table="stats" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587403 4815 flags.go:64] FLAG: --storage-driver-user="root" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587407 4815 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587411 4815 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587416 4815 flags.go:64] FLAG: --system-cgroups="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587420 4815 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587426 4815 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587431 4815 flags.go:64] FLAG: --tls-cert-file="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587435 4815 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 
19:14:55.587440 4815 flags.go:64] FLAG: --tls-min-version="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587444 4815 flags.go:64] FLAG: --tls-private-key-file="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587449 4815 flags.go:64] FLAG: --topology-manager-policy="none" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587453 4815 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587457 4815 flags.go:64] FLAG: --topology-manager-scope="container" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587461 4815 flags.go:64] FLAG: --v="2" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587467 4815 flags.go:64] FLAG: --version="false" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587472 4815 flags.go:64] FLAG: --vmodule="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587477 4815 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587481 4815 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587603 4815 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587608 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587612 4815 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587616 4815 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587620 4815 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587624 4815 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587628 
4815 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587632 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587636 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587641 4815 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587645 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587650 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587654 4815 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587658 4815 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587661 4815 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587665 4815 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587669 4815 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587674 4815 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587678 4815 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587682 4815 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587686 4815 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587691 4815 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587696 4815 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587700 4815 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587706 4815 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587709 4815 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587714 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587717 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587721 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587725 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587728 4815 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587732 4815 feature_gate.go:330] unrecognized feature gate: 
GCPLabelsTags Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587736 4815 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587740 4815 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587744 4815 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587748 4815 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587751 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587756 4815 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587760 4815 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587764 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587767 4815 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587771 4815 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587774 4815 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587779 4815 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587784 4815 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587788 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587792 4815 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587795 4815 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587799 4815 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587803 4815 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587806 4815 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587810 4815 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587813 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587816 4815 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587820 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587823 4815 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587828 4815 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587832 4815 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587836 4815 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587839 4815 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587843 4815 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587846 4815 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587850 4815 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587853 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587856 4815 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587860 4815 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587863 4815 feature_gate.go:330] unrecognized feature gate: Example
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587867 4815 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587870 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587874 4815 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.587878 4815 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.587888 4815 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.601537 4815 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.601595 4815 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601734 4815 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601753 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601764 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601778 4815 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601791 4815 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601802 4815 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601810 4815 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601818 4815 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601825 4815 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601833 4815 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601841 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601849 4815 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601857 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601865 4815 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601872 4815 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601880 4815 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601889 4815 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601896 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601904 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601912 4815 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601952 4815 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601960 4815 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601968 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601976 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601984 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601991 4815 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.601999 4815 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602007 4815 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602015 4815 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602029 4815 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602042 4815 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602054 4815 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602064 4815 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602074 4815 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602083 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602091 4815 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602098 4815 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602107 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602114 4815 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602122 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602129 4815 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602139 4815 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602150 4815 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602160 4815 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602169 4815 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602178 4815 feature_gate.go:330] unrecognized feature gate: Example
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602186 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602194 4815 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602203 4815 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602211 4815 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602221 4815 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602228 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602236 4815 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602245 4815 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602252 4815 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602260 4815 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602269 4815 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602277 4815 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602286 4815 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602295 4815 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602303 4815 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602311 4815 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602318 4815 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602326 4815 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602334 4815 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602342 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602349 4815 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602357 4815 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602365 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602373 4815 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602380 4815 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.602394 4815 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602622 4815 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602637 4815 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602648 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602657 4815 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602665 4815 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602674 4815 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602682 4815 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602691 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602699 4815 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602707 4815 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602716 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602724 4815 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602732 4815 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602740 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602748 4815 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602758 4815 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602770 4815 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602782 4815 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602802 4815 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602818 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602830 4815 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602843 4815 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602856 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602869 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602879 4815 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602888 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602896 4815 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602903 4815 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602911 4815 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602953 4815 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602963 4815 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602973 4815 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602983 4815 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602992 4815 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.602999 4815 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603008 4815 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603016 4815 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603057 4815 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603067 4815 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603078 4815 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603086 4815 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603096 4815 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603105 4815 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603114 4815 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603122 4815 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603130 4815 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603138 4815 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603146 4815 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603154 4815 feature_gate.go:330] unrecognized feature gate: Example
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603161 4815 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603170 4815 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603177 4815 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603185 4815 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603193 4815 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603201 4815 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603209 4815 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603216 4815 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603224 4815 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603232 4815 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603242 4815 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603251 4815 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603260 4815 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603268 4815 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603276 4815 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603285 4815 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603295 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603311 4815 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603328 4815 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603338 4815 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603350 4815 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.603359 4815 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.603376 4815 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.603968 4815 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.608211 4815 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.608355 4815 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.609541 4815 server.go:997] "Starting client certificate rotation"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.609579 4815 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.609801 4815 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 14:03:30.79393917 +0000 UTC
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.609966 4815 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 810h48m35.183980974s for next certificate rotation
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.616693 4815 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.619309 4815 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.629156 4815 log.go:25] "Validated CRI v1 runtime API"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.649772 4815 log.go:25] "Validated CRI v1 image API"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.651856 4815 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.655260 4815 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-07-19-09-03-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.655438 4815 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.677058 4815 manager.go:217] Machine: {Timestamp:2025-12-07 19:14:55.67543174 +0000 UTC m=+0.254421835 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2800000 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:077277cc-9bde-4aeb-947a-0cf3c49a1ac0 BootID:3d749f27-20c8-4d23-ad52-dc6b852bf3b7 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b3:6f:9d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b3:6f:9d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:f7:fc:86 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ec:c0:18 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:4d:1a:79 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7e:f0:5a Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ee:9b:fe Speed:-1 Mtu:1496} {Name:eth10 MacAddress:36:92:c9:4d:76:93 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:a5:c1:ed:db:01 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.677397 4815 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.677597 4815 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.678494 4815 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679106 4815 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679163 4815 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679532 4815 topology_manager.go:138] "Creating topology manager with none policy"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679555 4815 container_manager_linux.go:303] "Creating device plugin manager"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679869 4815 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.679945 4815 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.680245 4815 state_mem.go:36] "Initialized new in-memory state store"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.680387 4815 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.681320 4815 kubelet.go:418] "Attempting to sync node with API server"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.681352 4815 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.681399 4815 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.681421 4815 kubelet.go:324] "Adding apiserver pod source"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.681439 4815 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.683795 4815 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.684682 4815 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.685814 4815 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.686139 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.686343 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError"
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.686192 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.686432 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686747 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686792 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686808 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686821 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686843 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686857 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686871 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686893 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.686912 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.687172 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.687197 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.687214 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.687544 4815 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.688198 4815 server.go:1280] "Started kubelet"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.688998 4815 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 07 19:14:55 crc systemd[1]: Started Kubernetes Kubelet.
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.691813 4815 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.691056 4815 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.693311 4815 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.693359 4815 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.693405 4815 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 20:23:11.471082264 +0000 UTC
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.693834 4815 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.693852 4815 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.694055 4815 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.695121 4815 server.go:460] "Adding debug handlers to kubelet server"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.703479 4815 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.704112 4815 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.708431 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.708555 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.705847 4815 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f04d5e623407a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-07 19:14:55.688138874 +0000 UTC m=+0.267128959,LastTimestamp:2025-12-07 19:14:55.688138874 +0000 UTC m=+0.267128959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.708722 4815 factory.go:55] Registering systemd factory
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.708744 4815 factory.go:221] Registration of the systemd container factory successfully
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.710236 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="200ms"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.712498 4815 factory.go:153] Registering CRI-O factory
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.712534 4815 factory.go:221] Registration of the crio container factory successfully
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.712758 4815 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.712857 4815 factory.go:103] Registering Raw factory
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.712890 4815 manager.go:1196] Started watching for new ooms in manager
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.717195 4815 manager.go:319] Starting recovery of all containers
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722769 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722822 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722845 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722865 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722884 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722902 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722955 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722973 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.722995 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723064 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723091 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723109 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723127 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723148 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723166 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723211 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723230 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723248 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723319 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723340 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723358 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723376 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723393 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723410 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723427 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723447 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723470 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723492 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723521 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723538 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723555 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723579 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723597 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723617 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723658 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723677 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723693 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723711 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723730 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723776 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723796 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723815 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723832 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723851 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.723869 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729541 4815 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729593 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729616 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729635 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729654 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729672 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729690 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729708 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729735 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729756 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729777 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729797 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729817 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729836 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729854 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729873 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729892 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729910 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729959 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729979 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.729997 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730015 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730033 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730051 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730071 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730090 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730109 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730127 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730147 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730166 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730183 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730205 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730231 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730255 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730281 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730303 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730320 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730338 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730355 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730381 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730398 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730418 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730436 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660"
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730453 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730472 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730489 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730508 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730526 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730544 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730563 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730581 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730600 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730618 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730637 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730655 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 07 
19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730672 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730689 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730708 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730725 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730744 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730770 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730790 4815 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730809 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730829 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730848 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730868 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730889 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730906 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730973 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.730991 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731009 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731047 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731066 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731084 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731102 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731120 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731139 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731157 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731176 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731197 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" 
seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731215 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731232 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731251 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731269 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731288 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731307 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731326 4815 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731344 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731361 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731377 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731396 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731413 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731432 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731450 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731468 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731486 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731506 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731544 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731561 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731579 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731598 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731616 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731635 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731652 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731678 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" 
seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731698 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731716 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731733 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731753 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731771 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731790 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731808 4815 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731825 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731843 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731860 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731879 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731898 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731938 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731957 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731976 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.731993 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732011 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732029 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732047 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732066 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732085 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732102 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732120 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732138 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732156 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732173 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732191 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732207 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732224 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732242 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732261 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732277 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732296 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732313 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732331 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732349 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732367 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732385 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732403 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732422 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732438 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732456 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732472 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732490 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732508 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732526 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732544 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732562 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732580 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732597 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732616 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732634 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732653 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732670 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732687 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732706 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732724 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732741 4815 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732758 4815 reconstruct.go:97] "Volume reconstruction finished"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.732770 4815 reconciler.go:26] "Reconciler: start to sync state"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.752235 4815 manager.go:324] Recovery completed
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.762975 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.764383 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.764510 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.764588 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.766165 4815 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.766252 4815 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.766328 4815 state_mem.go:36] "Initialized new in-memory state store"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.766748 4815 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.768526 4815 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.768570 4815 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.768601 4815 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.768661 4815 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 07 19:14:55 crc kubenswrapper[4815]: W1207 19:14:55.773846 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.773969 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.803745 4815 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.847240 4815 policy_none.go:49] "None policy: Start"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.851238 4815 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.851296 4815 state_mem.go:35] "Initializing new in-memory state store"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.868858 4815 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.904807 4815 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.913522 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="400ms"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.941978 4815 manager.go:334] "Starting Device Plugin manager"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942033 4815 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942047 4815 server.go:79] "Starting device plugin registration server"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942406 4815 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942426 4815 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942790 4815 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942871 4815 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 07 19:14:55 crc kubenswrapper[4815]: I1207 19:14:55.942880 4815 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 07 19:14:55 crc kubenswrapper[4815]: E1207 19:14:55.957550 4815 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.043089 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.044103 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.044140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.044157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.044189 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.044550 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.069773 4815 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.069970 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.071243 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.071298 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.071320 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.071474 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.071911 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.072024 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.072582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.072635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.072658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.072823 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.073001 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.073061 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.073365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.073413 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.073434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075689 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075739 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075889 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.075993 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.076003 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.076161 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.076223 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.078789 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.078831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.078848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.078960 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.078989 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.079006 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.079017 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.080809 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.080875 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.084807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.084849 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.084864 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.085090 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.085165 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.085383 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.085438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.085458 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.086268 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.086314 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.086331 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137149 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137202 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137285 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137416 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137461 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137492 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137524 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137553 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137610 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137672 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137729 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137758 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137778 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137793 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.137815 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.238819 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.238877 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.238898 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.238981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239000 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.238998 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239014 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239031 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239089 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239126 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239145 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239149 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239100 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239178 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239223 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239186 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239156 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239250 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239275 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239185 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239294 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" 
(UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239314 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239319 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239316 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239333 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239376 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239334 
4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239427 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.239515 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.244995 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.247179 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.247335 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.247363 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.247422 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.248084 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.2:6443: connect: connection refused" node="crc" Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.315043 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="800ms" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.407810 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.430513 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.437370 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-062af368ace51e5639b51d81da63c0b063d507d96fbe4e9405e28993bc7938ed WatchSource:0}: Error finding container 062af368ace51e5639b51d81da63c0b063d507d96fbe4e9405e28993bc7938ed: Status 404 returned error can't find the container with id 062af368ace51e5639b51d81da63c0b063d507d96fbe4e9405e28993bc7938ed Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.456473 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.460593 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e9fbfb5602836dd606da09621895457b6c225ffcf45f9a195a5cc5d391f20c55 WatchSource:0}: Error finding container e9fbfb5602836dd606da09621895457b6c225ffcf45f9a195a5cc5d391f20c55: Status 404 returned error can't find the container with id e9fbfb5602836dd606da09621895457b6c225ffcf45f9a195a5cc5d391f20c55 Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.473076 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.501191 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.512222 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-aa485f9d96c26e610851ac2c5a428f060d3cc622d2d7af02ef124790a9540e64 WatchSource:0}: Error finding container aa485f9d96c26e610851ac2c5a428f060d3cc622d2d7af02ef124790a9540e64: Status 404 returned error can't find the container with id aa485f9d96c26e610851ac2c5a428f060d3cc622d2d7af02ef124790a9540e64 Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.520535 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-679ac654491aabac9f3b79b0f0713f47109d2120a11a5ba459459a42510d9822 WatchSource:0}: Error finding container 679ac654491aabac9f3b79b0f0713f47109d2120a11a5ba459459a42510d9822: Status 404 returned error can't find the container with id 
679ac654491aabac9f3b79b0f0713f47109d2120a11a5ba459459a42510d9822 Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.615399 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.615527 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.640040 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.640145 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.649089 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.650241 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.650294 4815 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.650306 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.650333 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.650708 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc" Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.693575 4815 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 15:24:58.509821147 +0000 UTC Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.693650 4815 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 836h10m1.816176135s for next certificate rotation Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.693707 4815 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.775996 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ac9361918d19b3fad67f709c24b2de52c57c5ef7f95cef1947b419d06bf165a6"} Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.776856 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e9fbfb5602836dd606da09621895457b6c225ffcf45f9a195a5cc5d391f20c55"} Dec 07 19:14:56 
crc kubenswrapper[4815]: I1207 19:14:56.777985 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"062af368ace51e5639b51d81da63c0b063d507d96fbe4e9405e28993bc7938ed"} Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.778747 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"679ac654491aabac9f3b79b0f0713f47109d2120a11a5ba459459a42510d9822"} Dec 07 19:14:56 crc kubenswrapper[4815]: I1207 19:14:56.783549 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aa485f9d96c26e610851ac2c5a428f060d3cc622d2d7af02ef124790a9540e64"} Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.882480 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.882585 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 07 19:14:56 crc kubenswrapper[4815]: W1207 19:14:56.890373 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused 
Dec 07 19:14:56 crc kubenswrapper[4815]: E1207 19:14:56.890470 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.2:6443: connect: connection refused" logger="UnhandledError" Dec 07 19:14:57 crc kubenswrapper[4815]: E1207 19:14:57.116553 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="1.6s" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.452050 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.454073 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.454137 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.454158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.454198 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:14:57 crc kubenswrapper[4815]: E1207 19:14:57.454821 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.2:6443: connect: connection refused" node="crc" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.693208 4815 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.2:6443: connect: connection refused Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.787800 4815 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4" exitCode=0 Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.787950 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.787952 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.789354 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.789386 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.789395 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.796803 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.796840 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.796853 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798100 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b" exitCode=0 Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798156 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798178 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798800 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.798835 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.800202 4815 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad" exitCode=0 Dec 07 19:14:57 crc 
kubenswrapper[4815]: I1207 19:14:57.800250 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.800323 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.800742 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806206 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.806265 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.813968 4815 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450" exitCode=0 Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.814042 4815 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450"} Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.814159 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.815757 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.815788 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:57 crc kubenswrapper[4815]: I1207 19:14:57.815805 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.821800 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.821971 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.823300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.823334 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.823346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.831272 4815 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.831306 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.833882 4815 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c" exitCode=0 Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.833975 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.836378 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fc18ff4f1a0a1e7aae38ce7dad5fbea485553d57aac5dbd709fe94d69b4ac6d2"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.836465 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.837357 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.837385 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.837398 
4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.840696 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.840727 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6"} Dec 07 19:14:58 crc kubenswrapper[4815]: I1207 19:14:58.840740 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2"} Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.027529 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.055587 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.056749 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.056777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.056788 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:59 crc 
kubenswrapper[4815]: I1207 19:14:59.056812 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.852943 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.853030 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.853075 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.853119 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.853492 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.855966 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2"} Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.856084 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5"} Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.856143 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029"} Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.856335 4815 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.856404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.856463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857237 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857403 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857442 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857776 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.857904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.858208 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.858273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.858325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:14:59 crc kubenswrapper[4815]: 
I1207 19:14:59.858455 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.858527 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:14:59 crc kubenswrapper[4815]: I1207 19:14:59.858590 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.396741 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.426107 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.858517 4815 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701" exitCode=0 Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.858594 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701"} Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.858709 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.858977 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.859261 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860508 4815 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860538 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860549 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.860999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.862123 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.862147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:00 crc kubenswrapper[4815]: I1207 19:15:00.862158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.865390 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf"} Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.865440 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a"} Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.865459 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3"} Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.865475 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.865545 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866507 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866539 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866550 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:01 crc kubenswrapper[4815]: I1207 19:15:01.866567 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.028574 4815 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" start-of-body= Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.028687 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded" Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.875076 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928"} Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.875151 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003"} Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.875257 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.876723 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.876780 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:02 crc kubenswrapper[4815]: I1207 19:15:02.876799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.204418 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.204661 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.206286 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.206346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.206364 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.580673 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.580967 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.582564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.582639 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.582671 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.877615 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.883294 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 
19:15:03.883442 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.883505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.898743 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.899012 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.900651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.900792 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:03 crc kubenswrapper[4815]: I1207 19:15:03.900861 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:04 crc kubenswrapper[4815]: I1207 19:15:04.290981 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:04 crc kubenswrapper[4815]: I1207 19:15:04.291539 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:04 crc kubenswrapper[4815]: I1207 19:15:04.293492 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:04 crc kubenswrapper[4815]: I1207 19:15:04.293694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:04 crc kubenswrapper[4815]: I1207 19:15:04.294082 4815 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.548370 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.548675 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.550356 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.550420 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.550447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.558658 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.884292 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.886134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.886206 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:05 crc kubenswrapper[4815]: I1207 19:15:05.886271 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:05 crc kubenswrapper[4815]: E1207 19:15:05.958668 4815 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" 
not found" Dec 07 19:15:06 crc kubenswrapper[4815]: I1207 19:15:06.747157 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 07 19:15:06 crc kubenswrapper[4815]: I1207 19:15:06.748428 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:06 crc kubenswrapper[4815]: I1207 19:15:06.750745 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:06 crc kubenswrapper[4815]: I1207 19:15:06.750800 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:06 crc kubenswrapper[4815]: I1207 19:15:06.750819 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:08 crc kubenswrapper[4815]: W1207 19:15:08.641565 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 07 19:15:08 crc kubenswrapper[4815]: I1207 19:15:08.642196 4815 trace.go:236] Trace[883142255]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Dec-2025 19:14:58.640) (total time: 10002ms): Dec 07 19:15:08 crc kubenswrapper[4815]: Trace[883142255]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:15:08.641) Dec 07 19:15:08 crc kubenswrapper[4815]: Trace[883142255]: [10.002086704s] [10.002086704s] END Dec 07 19:15:08 crc kubenswrapper[4815]: E1207 19:15:08.642237 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 07 19:15:08 crc kubenswrapper[4815]: I1207 19:15:08.694251 4815 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 07 19:15:08 crc kubenswrapper[4815]: W1207 19:15:08.707039 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 07 19:15:08 crc kubenswrapper[4815]: I1207 19:15:08.707257 4815 trace.go:236] Trace[1943947974]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Dec-2025 19:14:58.705) (total time: 10001ms): Dec 07 19:15:08 crc kubenswrapper[4815]: Trace[1943947974]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:15:08.707) Dec 07 19:15:08 crc kubenswrapper[4815]: Trace[1943947974]: [10.001921899s] [10.001921899s] END Dec 07 19:15:08 crc kubenswrapper[4815]: E1207 19:15:08.707454 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 07 19:15:08 crc kubenswrapper[4815]: E1207 19:15:08.718091 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 07 19:15:09 crc kubenswrapper[4815]: E1207 19:15:09.058225 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 07 19:15:09 crc kubenswrapper[4815]: W1207 19:15:09.595488 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 07 19:15:09 crc kubenswrapper[4815]: I1207 19:15:09.595653 4815 trace.go:236] Trace[1638050068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Dec-2025 19:14:59.594) (total time: 10001ms): Dec 07 19:15:09 crc kubenswrapper[4815]: Trace[1638050068]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:15:09.595) Dec 07 19:15:09 crc kubenswrapper[4815]: Trace[1638050068]: [10.001551551s] [10.001551551s] END Dec 07 19:15:09 crc kubenswrapper[4815]: E1207 19:15:09.595696 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 07 19:15:09 crc kubenswrapper[4815]: W1207 19:15:09.680946 4815 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 07 19:15:09 crc kubenswrapper[4815]: I1207 19:15:09.681079 4815 trace.go:236] Trace[456874173]: 
"Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Dec-2025 19:14:59.678) (total time: 10002ms): Dec 07 19:15:09 crc kubenswrapper[4815]: Trace[456874173]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:15:09.680) Dec 07 19:15:09 crc kubenswrapper[4815]: Trace[456874173]: [10.002083144s] [10.002083144s] END Dec 07 19:15:09 crc kubenswrapper[4815]: E1207 19:15:09.681115 4815 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 07 19:15:10 crc kubenswrapper[4815]: E1207 19:15:10.024134 4815 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.187f04d5e623407a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-07 19:14:55.688138874 +0000 UTC m=+0.267128959,LastTimestamp:2025-12-07 19:14:55.688138874 +0000 UTC m=+0.267128959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.028249 4815 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.028373 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.258739 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.260519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.260557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.260574 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.260603 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.426688 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.426972 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.429239 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.429318 4815 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.429350 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.475573 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.903465 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.905002 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.905211 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.905351 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:12 crc kubenswrapper[4815]: I1207 19:15:12.924103 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.581541 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.581603 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.903518 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.904197 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.905660 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.905707 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.905724 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.906319 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.907709 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.907812 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:13 crc kubenswrapper[4815]: I1207 19:15:13.907840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:15 crc kubenswrapper[4815]: I1207 19:15:15.266013 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 07 19:15:15 crc kubenswrapper[4815]: I1207 19:15:15.266078 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 07 19:15:15 crc kubenswrapper[4815]: E1207 19:15:15.959155 4815 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.588262 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.589359 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.590970 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.591033 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.591057 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.597119 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.918627 4815 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.919141 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:18 crc 
kubenswrapper[4815]: I1207 19:15:18.920549 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.920608 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:18 crc kubenswrapper[4815]: I1207 19:15:18.920633 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.265438 4815 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.265612 4815 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.267707 4815 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.267888 4815 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.268857 4815 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.271547 4815 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.315577 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47524->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 
19:15:20.315643 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:47524->192.168.126.11:17697: read: connection reset by peer" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.316029 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.316073 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.316174 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40202->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.316281 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40202->192.168.126.11:17697: read: connection reset by peer" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.410504 4815 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.425703 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.426511 4815 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.426572 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.695356 4815 apiserver.go:52] "Watching apiserver" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.697097 4815 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.697386 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.697674 4815 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.697685 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.697756 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.698055 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.698083 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.698093 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.698125 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.698102 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.698338 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.699328 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.699448 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.699609 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.699771 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.699792 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.700268 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.700383 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.701176 4815 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.704672 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.721200 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.733566 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.742705 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.754622 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.765112 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.775764 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.783825 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.794190 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.794878 4815 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.805641 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.870987 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871049 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871083 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871117 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871149 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871181 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871210 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 
19:15:20.871243 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871333 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871365 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871402 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871435 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871465 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871494 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871507 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871528 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871532 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871625 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871654 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871652 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871693 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871682 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.871770 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872003 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872029 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872042 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872056 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872078 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872101 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872120 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872126 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872143 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872156 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872191 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872141 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872250 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872270 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872290 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872306 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872320 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872340 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872348 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872357 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872369 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872396 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872422 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872447 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872472 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872496 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872520 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872562 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872584 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872605 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872662 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872684 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872706 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872767 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872790 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872811 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872853 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872877 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872900 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872960 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872981 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873013 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873034 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873055 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873074 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873094 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873114 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873137 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873158 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873183 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873205 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873226 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873246 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 07 
19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873266 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873288 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873310 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873332 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873381 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873406 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873435 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873465 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873498 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873521 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873546 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873567 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873589 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873613 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873633 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873655 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873678 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873702 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873731 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873762 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873786 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873868 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873895 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873934 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873973 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873996 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874022 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874045 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872438 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874066 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872509 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874090 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874114 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874138 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874162 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874185 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874207 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874232 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874254 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874278 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874299 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874323 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874346 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874370 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874396 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874420 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874444 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874468 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874490 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874513 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874537 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874559 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874647 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874671 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874694 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874715 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874750 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874776 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874799 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874824 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874848 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874870 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874895 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874936 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874958 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874981 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875004 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875025 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875048 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.875070 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875092 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875114 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875135 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875156 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875179 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875201 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875225 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875248 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875273 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875295 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875318 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875341 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875364 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875385 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875407 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875432 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875454 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875476 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875498 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875525 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875546 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.875567 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875592 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875615 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875637 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875659 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875680 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875703 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875730 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875763 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875785 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875809 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875831 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875853 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875876 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875898 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876348 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876376 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 07 
19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876404 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876426 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876451 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876474 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876497 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876520 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876573 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876596 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876619 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876641 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876663 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.876687 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876716 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876752 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876778 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876801 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876825 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876847 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876870 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876894 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876935 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876959 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 
19:15:20.876982 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877039 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877067 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877094 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877123 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.877150 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877174 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877201 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877227 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877257 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877284 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877309 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877332 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877355 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877377 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877441 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877457 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877471 4815 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877485 4815 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877500 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877514 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877528 4815 reconciler_common.go:293] "Volume detached for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877544 4815 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877557 4815 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877571 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877586 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877599 4815 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.877613 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.883073 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872597 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872608 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.872931 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873128 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873218 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873336 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873410 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873421 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873442 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873543 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873701 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873723 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.873902 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874046 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874095 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874132 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874244 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874338 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874518 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874678 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874820 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.874822 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875230 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.875687 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876066 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876428 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.876439 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.878094 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.878260 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.878326 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.878614 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.878883 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.879158 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.879527 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.879856 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.880113 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.880699 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.881145 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.881176 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.881444 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.881612 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.881817 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.883212 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.883563 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.885162 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.886383 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.887496 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.887951 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888050 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888472 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888781 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888879 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888904 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.888969 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889240 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889365 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889459 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889635 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889960 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.889968 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.890383 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.890606 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.890759 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.891552 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.892333 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.892654 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.893212 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.901397 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.901530 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.901884 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.902287 4815 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.902455 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.902607 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.893421 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.893736 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895035 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895238 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895405 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895572 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895756 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.895995 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.896171 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.896328 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.896953 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.896984 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897303 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897458 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897583 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.884008 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897601 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897770 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897767 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.897886 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.898044 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.898067 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.898161 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.903122 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:21.403097907 +0000 UTC m=+25.982087962 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.898380 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.902829 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.903337 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.904165 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.906347 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.915408 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.915819 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.915935 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.916026 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.916143 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.916226 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.916230 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.916389 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.916417 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.916318 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.916710 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.917379 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.917450 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.917856 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.918269 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.918499 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.920518 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.921007 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.921332 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.921549 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.921835 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922195 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922246 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922391 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922542 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922851 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.922900 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.923300 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.923317 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.923578 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.923853 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.924078 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.924326 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.924717 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.925042 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.925266 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.925550 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.925696 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.926086 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.926271 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.926597 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.926645 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.926941 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.927027 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.927201 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.927257 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.927681 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.927805 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.928155 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.928385 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.928632 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.928810 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.929040 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.929317 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.929648 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.929871 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.930114 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.930753 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.930785 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931080 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931161 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931273 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931514 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931719 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931837 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931854 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.931883 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.932254 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.932281 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.932301 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.932630 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.932829 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.932989 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933060 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933104 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:21.433086174 +0000 UTC m=+26.012076219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933123 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933169 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933238 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933039 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933315 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933452 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:21.433434454 +0000 UTC m=+26.012424489 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933661 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933777 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.933878 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.933882 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:21.433873805 +0000 UTC m=+26.012863840 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.934024 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:15:21.434016319 +0000 UTC m=+26.013006364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.934253 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.934514 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.934567 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.934943 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.939357 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.942588 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.946209 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.946717 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.952078 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.953619 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.953848 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.954550 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.957421 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2" exitCode=255 Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.957976 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2"} Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.960855 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: E1207 19:15:20.965145 4815 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.967056 4815 scope.go:117] "RemoveContainer" containerID="539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.967510 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.969244 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.972751 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.974723 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.977857 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.977927 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.977986 4815 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978118 4815 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.977985 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978134 4815 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978008 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978246 4815 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978256 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978266 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978318 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on 
node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978327 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978335 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978360 4815 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978368 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978377 4815 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978384 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978392 4815 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978400 4815 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978408 4815 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978418 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978426 4815 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978434 4815 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978443 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978451 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978460 4815 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node 
\"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978468 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978477 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978486 4815 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978494 4815 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978503 4815 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978511 4815 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978562 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978682 
4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978692 4815 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978700 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978708 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978732 4815 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978741 4815 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978749 4815 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978773 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978782 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978791 4815 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978814 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978822 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978832 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978841 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978850 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on 
node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978858 4815 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978867 4815 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978875 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978883 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978891 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978899 4815 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978907 4815 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.978924 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978933 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978941 4815 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978949 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978957 4815 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978965 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978974 4815 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978982 4815 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978991 4815 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.978999 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979008 4815 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979016 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979025 4815 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979033 4815 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979041 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979050 4815 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979058 4815 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979067 4815 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979076 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979084 4815 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979092 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979100 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979109 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979117 4815 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979125 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979133 4815 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979140 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979148 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979156 4815 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979163 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979171 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979179 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979187 4815 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979194 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979202 4815 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979209 4815 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 
19:15:20.979219 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979229 4815 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979237 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979245 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979255 4815 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979263 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979271 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.979280 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979287 4815 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979296 4815 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979324 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979333 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979341 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979349 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979356 4815 reconciler_common.go:293] "Volume 
detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979385 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979394 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979402 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979412 4815 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979421 4815 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979429 4815 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979437 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979445 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979453 4815 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979461 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979469 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979477 4815 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979485 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979493 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") 
on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979501 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979547 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979555 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979564 4815 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979591 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979600 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979608 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.979616 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979625 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979654 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.979663 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980236 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980244 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980253 4815 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980260 4815 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980268 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980275 4815 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980282 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980294 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980303 4815 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980311 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980318 4815 reconciler_common.go:293] "Volume detached for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980326 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980333 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980341 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980348 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980356 4815 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980363 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980371 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 
19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980378 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980387 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980394 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980402 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980409 4815 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980416 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980425 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980433 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980440 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980448 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980456 4815 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980463 4815 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980470 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980478 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980486 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 
19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980494 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980502 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980510 4815 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980518 4815 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980525 4815 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980533 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.980567 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc 
kubenswrapper[4815]: I1207 19:15:20.981155 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981169 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981179 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981188 4815 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981196 4815 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981204 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981212 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981220 4815 reconciler_common.go:293] "Volume detached 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981228 4815 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981236 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981244 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981251 4815 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981258 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.981266 4815 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.983845 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:20 crc kubenswrapper[4815]: I1207 19:15:20.992884 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.003015 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.016843 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.027777 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.039938 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.042285 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.053622 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.058565 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 07 19:15:21 crc kubenswrapper[4815]: W1207 19:15:21.063200 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ef4bc935c671570058394f4f20107721cf46f5925aa10d831f2271a31afe901f WatchSource:0}: Error finding container ef4bc935c671570058394f4f20107721cf46f5925aa10d831f2271a31afe901f: Status 404 returned error can't find the container with id 
ef4bc935c671570058394f4f20107721cf46f5925aa10d831f2271a31afe901f Dec 07 19:15:21 crc kubenswrapper[4815]: W1207 19:15:21.066251 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-ff3dd9727cfb5d654e9770d36a2abeb0e177caf4a39ba7be3cec064358c46fb4 WatchSource:0}: Error finding container ff3dd9727cfb5d654e9770d36a2abeb0e177caf4a39ba7be3cec064358c46fb4: Status 404 returned error can't find the container with id ff3dd9727cfb5d654e9770d36a2abeb0e177caf4a39ba7be3cec064358c46fb4 Dec 07 19:15:21 crc kubenswrapper[4815]: W1207 19:15:21.067785 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-2f4e1654e3d8159774fb208dc87347086a603aa4faca6d312f7b82718649c736 WatchSource:0}: Error finding container 2f4e1654e3d8159774fb208dc87347086a603aa4faca6d312f7b82718649c736: Status 404 returned error can't find the container with id 2f4e1654e3d8159774fb208dc87347086a603aa4faca6d312f7b82718649c736 Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.484283 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.484460 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.484482 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:15:22.484449358 +0000 UTC m=+27.063439403 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.484657 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.484744 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:22.484720765 +0000 UTC m=+27.063710850 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.485328 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485476 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485510 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485529 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485590 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-07 19:15:22.485572658 +0000 UTC m=+27.064562743 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.485641 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.485686 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485774 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485815 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:22.485807105 +0000 UTC m=+27.064797150 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485900 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485942 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.485959 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.486032 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:22.48601475 +0000 UTC m=+27.065004835 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.771188 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.771301 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.771620 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:21 crc kubenswrapper[4815]: E1207 19:15:21.771684 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.773116 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.773640 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.774429 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.775081 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.775616 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.776172 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.776792 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.778504 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.779091 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.779960 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.780412 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.781460 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.781986 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.782867 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.783368 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.784225 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.784794 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.785183 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.786092 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.786648 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.787128 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.788245 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.788707 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.789725 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.790136 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.791172 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.791857 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.792405 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.793325 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.793758 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.794542 4815 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.794640 4815 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.796255 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.797115 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.797540 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.799020 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.800005 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.800512 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.801469 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.802125 4815 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.802560 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.803509 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.804527 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.805154 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.805943 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.806441 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.807286 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.808049 4815 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.808873 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.809320 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.809756 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.810638 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.813306 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.813758 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.962952 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.964971 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.965177 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.966037 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"2f4e1654e3d8159774fb208dc87347086a603aa4faca6d312f7b82718649c736"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.967306 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.967352 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ff3dd9727cfb5d654e9770d36a2abeb0e177caf4a39ba7be3cec064358c46fb4"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.968934 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.968980 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.968997 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef4bc935c671570058394f4f20107721cf46f5925aa10d831f2271a31afe901f"} Dec 07 19:15:21 crc kubenswrapper[4815]: I1207 19:15:21.987504 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:21Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.004475 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:21Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.024955 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.039952 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.060304 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.076755 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.099115 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.113215 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.128047 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 
19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.144632 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.166144 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.181875 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.206834 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.218957 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.246336 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.280374 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:22Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.493828 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 
19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.493887 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.493930 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.493948 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.493964 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494035 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:22 crc 
kubenswrapper[4815]: E1207 19:15:22.494045 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:15:24.49401911 +0000 UTC m=+29.073009155 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494089 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:24.494075931 +0000 UTC m=+29.073065976 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494103 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494118 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494132 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494147 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494157 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:24.494147263 +0000 UTC m=+29.073137298 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494165 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494179 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494186 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:24.494172784 +0000 UTC m=+29.073162839 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494188 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.494215 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:24.494208895 +0000 UTC m=+29.073198940 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:22 crc kubenswrapper[4815]: I1207 19:15:22.769270 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:22 crc kubenswrapper[4815]: E1207 19:15:22.769382 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:23 crc kubenswrapper[4815]: I1207 19:15:23.769796 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:23 crc kubenswrapper[4815]: I1207 19:15:23.769885 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:23 crc kubenswrapper[4815]: E1207 19:15:23.770336 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:23 crc kubenswrapper[4815]: E1207 19:15:23.770518 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:23 crc kubenswrapper[4815]: I1207 19:15:23.977892 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f"} Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.002354 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:23Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.025685 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.047260 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.078727 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.099350 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.116839 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.134388 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.154737 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:24Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.511622 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.511721 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.511762 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.511800 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.511835 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:15:28.511796375 +0000 UTC m=+33.090786450 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.511890 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.511897 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.511987 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:28.511965089 +0000 UTC m=+33.090955214 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.511999 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512094 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:28.512067402 +0000 UTC m=+33.091057527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512101 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512140 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512160 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512107 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512232 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:28.512208686 +0000 UTC m=+33.091198811 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512273 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512310 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.512414 4815 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:28.51238251 +0000 UTC m=+33.091372595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:24 crc kubenswrapper[4815]: I1207 19:15:24.769215 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:24 crc kubenswrapper[4815]: E1207 19:15:24.769348 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.503812 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-nz6q9"] Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.504111 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.505894 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.507450 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.509947 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.532863 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.545140 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.610824 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.619524 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp5k8\" (UniqueName: \"kubernetes.io/projected/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-kube-api-access-cp5k8\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.619595 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-hosts-file\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.629240 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.641973 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.653792 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.663568 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.676128 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.686403 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.720886 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-hosts-file\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.720950 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp5k8\" (UniqueName: \"kubernetes.io/projected/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-kube-api-access-cp5k8\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 
crc kubenswrapper[4815]: I1207 19:15:25.721015 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-hosts-file\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.736967 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp5k8\" (UniqueName: \"kubernetes.io/projected/92d414c2-ecc3-4598-ac9d-b982bfd89c7e-kube-api-access-cp5k8\") pod \"node-resolver-nz6q9\" (UID: \"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\") " pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.768894 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.768976 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:25 crc kubenswrapper[4815]: E1207 19:15:25.769045 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:25 crc kubenswrapper[4815]: E1207 19:15:25.769099 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.783862 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.799394 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.816344 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-nz6q9" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.819718 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\
" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.844004 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.856529 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.868290 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.881049 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.905868 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.933957 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:25 crc kubenswrapper[4815]: I1207 19:15:25.989334 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nz6q9" event={"ID":"92d414c2-ecc3-4598-ac9d-b982bfd89c7e","Type":"ContainerStarted","Data":"df97e91f851a8e0d3219b8d3e6184a8bd57c5279cc6d7d5f12f8d23a2f81280a"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.037615 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-s95hp"] Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.037973 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.039460 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-gkn4h"] Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.039774 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.043022 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.043022 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.043161 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.043172 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.047311 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.047676 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.058623 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.058836 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.059069 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.061540 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.066639 4815 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gmf4f"] Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.067222 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.071339 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.071423 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.071502 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.096135 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.129473 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.156523 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.168754 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.182409 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.196307 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225366 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225725 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-netns\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225768 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cnibin\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225788 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-k8s-cni-cncf-io\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225807 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-hostroot\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225825 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-daemon-config\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225843 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-system-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225868 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-socket-dir-parent\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225887 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prbsc\" (UniqueName: \"kubernetes.io/projected/0b739f36-d9c4-4fb6-9ead-9df05e283dea-kube-api-access-prbsc\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225905 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-system-cni-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225941 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225959 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-multus-certs\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225976 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cnibin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.225996 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-bin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226015 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-multus\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226031 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-kubelet\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226047 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-etc-kubernetes\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226064 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-os-release\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226081 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226106 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-rootfs\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226126 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25wn4\" (UniqueName: \"kubernetes.io/projected/3fefaece-ab52-48e2-9ee9-fb07be1922f1-kube-api-access-25wn4\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226143 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-os-release\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226159 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-conf-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226177 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226194 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cni-binary-copy\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226211 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226229 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-proxy-tls\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226246 
4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.226266 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v8vl\" (UniqueName: \"kubernetes.io/projected/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-kube-api-access-2v8vl\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.245634 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.264199 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.278578 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.293180 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.307817 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.321227 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.326905 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25wn4\" (UniqueName: \"kubernetes.io/projected/3fefaece-ab52-48e2-9ee9-fb07be1922f1-kube-api-access-25wn4\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.326945 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-os-release\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.326962 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-conf-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.326977 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.326999 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cni-binary-copy\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 
19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327014 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327029 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v8vl\" (UniqueName: \"kubernetes.io/projected/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-kube-api-access-2v8vl\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327043 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-proxy-tls\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327058 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327073 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-netns\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " 
pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327087 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cnibin\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327112 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-system-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327127 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-k8s-cni-cncf-io\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327140 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-hostroot\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327153 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-daemon-config\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327166 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-system-cni-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327181 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327203 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-socket-dir-parent\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327218 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prbsc\" (UniqueName: \"kubernetes.io/projected/0b739f36-d9c4-4fb6-9ead-9df05e283dea-kube-api-access-prbsc\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327232 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-multus-certs\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327246 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-bin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327261 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-multus\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327277 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-kubelet\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327290 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cnibin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327305 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-etc-kubernetes\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327321 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-os-release\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327336 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327364 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-rootfs\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327418 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-rootfs\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327745 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-os-release\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327772 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-conf-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.327806 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-system-cni-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328047 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-socket-dir-parent\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328121 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328299 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-multus-daemon-config\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328337 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-kubelet\") pod \"multus-s95hp\" (UID: 
\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328339 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-multus-certs\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328306 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328354 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-bin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328356 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cnibin\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328377 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-var-lib-cni-multus\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 
19:15:26.328378 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-etc-kubernetes\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328399 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-netns\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328403 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-os-release\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328579 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-system-cni-dir\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328579 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-host-run-k8s-cni-cncf-io\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328591 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cnibin\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328607 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0b739f36-d9c4-4fb6-9ead-9df05e283dea-hostroot\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328685 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0b739f36-d9c4-4fb6-9ead-9df05e283dea-cni-binary-copy\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328849 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fefaece-ab52-48e2-9ee9-fb07be1922f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.328960 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fefaece-ab52-48e2-9ee9-fb07be1922f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.329186 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-mcd-auth-proxy-config\") 
pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.333377 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-proxy-tls\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.348230 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prbsc\" (UniqueName: \"kubernetes.io/projected/0b739f36-d9c4-4fb6-9ead-9df05e283dea-kube-api-access-prbsc\") pod \"multus-s95hp\" (UID: \"0b739f36-d9c4-4fb6-9ead-9df05e283dea\") " pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.349372 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v8vl\" (UniqueName: \"kubernetes.io/projected/3d662ba2-aa03-4eea-bd30-8ad40638f6c7-kube-api-access-2v8vl\") pod \"machine-config-daemon-gkn4h\" (UID: \"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\") " pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.350023 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25wn4\" (UniqueName: \"kubernetes.io/projected/3fefaece-ab52-48e2-9ee9-fb07be1922f1-kube-api-access-25wn4\") pod \"multus-additional-cni-plugins-gmf4f\" (UID: \"3fefaece-ab52-48e2-9ee9-fb07be1922f1\") " pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.350737 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-s95hp" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.350838 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.358536 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:15:26 crc kubenswrapper[4815]: W1207 19:15:26.360008 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b739f36_d9c4_4fb6_9ead_9df05e283dea.slice/crio-f1fd4c193b9bcee717d2f01da4517066acfca7b628bf2ed8a0add27317d35917 WatchSource:0}: Error finding container f1fd4c193b9bcee717d2f01da4517066acfca7b628bf2ed8a0add27317d35917: Status 404 returned error can't find the container with id f1fd4c193b9bcee717d2f01da4517066acfca7b628bf2ed8a0add27317d35917 Dec 07 19:15:26 crc kubenswrapper[4815]: W1207 19:15:26.367773 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d662ba2_aa03_4eea_bd30_8ad40638f6c7.slice/crio-3767659bd02a5a1421719dce45edfb6a4dcf1405d84afcc383e59c374c313086 WatchSource:0}: Error finding container 3767659bd02a5a1421719dce45edfb6a4dcf1405d84afcc383e59c374c313086: Status 404 returned error can't find the container with id 3767659bd02a5a1421719dce45edfb6a4dcf1405d84afcc383e59c374c313086 Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.367951 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.380839 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.391128 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: W1207 19:15:26.399275 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fefaece_ab52_48e2_9ee9_fb07be1922f1.slice/crio-137de5ad3668f43c9e79743478d5ccfeb1d1e8287b9242ed8f7b5461d290c6b7 WatchSource:0}: Error finding container 137de5ad3668f43c9e79743478d5ccfeb1d1e8287b9242ed8f7b5461d290c6b7: Status 404 returned error can't find the container with id 137de5ad3668f43c9e79743478d5ccfeb1d1e8287b9242ed8f7b5461d290c6b7 Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.407549 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.430441 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 
19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.446933 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.459532 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.472109 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.672025 4815 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.673413 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.673691 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.673705 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.674122 4815 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.684612 4815 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.684942 4815 kubelet_node_status.go:79] "Successfully 
registered node" node="crc" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.686111 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.686145 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.686154 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.686171 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.686183 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.723977 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.728358 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.728382 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.728390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.728403 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.728411 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.740139 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"…\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.743176 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.743198 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.743208 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.743223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.743234 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.768933 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.769053 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.769066 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.772625 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.772653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.772665 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.772680 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.772693 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.801027 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.805445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.805466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.805474 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.805486 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.805496 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.826894 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: E1207 19:15:26.827199 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.828598 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.828617 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.828626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.828637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.828646 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.859474 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zzw6c"] Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.860210 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.863600 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.863767 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.863945 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.864062 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.864150 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.864285 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.864415 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.893940 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.905803 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.917021 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.928648 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 
19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.930167 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.930193 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.930201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.930215 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.930223 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:26Z","lastTransitionTime":"2025-12-07T19:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.941963 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.958517 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.969038 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.979261 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.990254 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.992447 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.992482 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.992492 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"3767659bd02a5a1421719dce45edfb6a4dcf1405d84afcc383e59c374c313086"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.993888 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" 
event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerStarted","Data":"73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.993925 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerStarted","Data":"f1fd4c193b9bcee717d2f01da4517066acfca7b628bf2ed8a0add27317d35917"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.995587 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-nz6q9" event={"ID":"92d414c2-ecc3-4598-ac9d-b982bfd89c7e","Type":"ContainerStarted","Data":"003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.996965 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5" exitCode=0 Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.996990 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5"} Dec 07 19:15:26 crc kubenswrapper[4815]: I1207 19:15:26.997004 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerStarted","Data":"137de5ad3668f43c9e79743478d5ccfeb1d1e8287b9242ed8f7b5461d290c6b7"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.009394 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.019140 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032295 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032330 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032347 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032362 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032376 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddph\" (UniqueName: \"kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032408 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032433 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032455 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032473 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032491 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032520 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032536 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032551 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032565 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032589 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 
19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032602 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032623 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032651 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032668 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.032687 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 
19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033513 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033713 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033721 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033735 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.033743 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.058560 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.078451 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.094421 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.112608 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.126718 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133354 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133406 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133478 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133426 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133476 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133532 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133551 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133572 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddph\" (UniqueName: 
\"kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133594 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133622 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133622 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133646 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133662 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash\") pod 
\"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133678 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133739 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133755 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133779 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133793 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch\") pod 
\"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133818 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133832 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133845 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133860 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133873 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133897 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134069 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134106 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.133646 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134146 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134223 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134414 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134449 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134478 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134586 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134614 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134673 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.134798 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.135323 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.135349 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.143204 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.143482 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.148042 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.148069 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.148079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.148092 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.148101 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.149997 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddph\" (UniqueName: \"kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph\") pod \"ovnkube-node-zzw6c\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.159959 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.170895 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.171964 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.187359 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: W1207 19:15:27.189139 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ca3d87_e6c7_4d51_9b77_34fdd2a38e1f.slice/crio-7b2f72cd2cbe2cb11f715a46c910cb0ce86591696554d1aaa22ae4b2b3c2e0b2 WatchSource:0}: Error finding container 7b2f72cd2cbe2cb11f715a46c910cb0ce86591696554d1aaa22ae4b2b3c2e0b2: Status 404 returned error can't find the container with id 7b2f72cd2cbe2cb11f715a46c910cb0ce86591696554d1aaa22ae4b2b3c2e0b2 Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.201708 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.213566 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.222784 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.233660 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.247347 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:27Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.249890 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.249953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.249965 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.249980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.249989 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.353036 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.353367 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.353378 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.353392 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.353401 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.455771 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.455807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.455816 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.455830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.455841 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.558602 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.558639 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.558651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.558700 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.558714 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.661262 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.661300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.661309 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.661327 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.661337 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.763815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.763884 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.763900 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.763995 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.764023 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.769143 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.769158 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:27 crc kubenswrapper[4815]: E1207 19:15:27.769371 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:27 crc kubenswrapper[4815]: E1207 19:15:27.769437 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.866701 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.866743 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.866752 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.866772 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.866783 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.969714 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.970149 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.970167 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.970189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:27 crc kubenswrapper[4815]: I1207 19:15:27.970204 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:27Z","lastTransitionTime":"2025-12-07T19:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.001971 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8" exitCode=0 Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.002039 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.003746 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f" exitCode=0 Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.003788 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.003841 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"7b2f72cd2cbe2cb11f715a46c910cb0ce86591696554d1aaa22ae4b2b3c2e0b2"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.019850 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.050335 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.071818 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.072709 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.072817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.072901 4815 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.073015 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.073100 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.085427 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.105217 4815 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.127489 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.139095 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.154440 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.163506 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.175568 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.175602 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.175611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.175626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.175637 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.179406 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.190883 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.204174 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.217455 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.232200 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.245986 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.258330 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.270741 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.277995 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.278043 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.278069 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.278084 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.278092 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.282398 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.293075 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.303584 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.314996 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.329252 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T1
9:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.345479 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.368841 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.379847 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.379882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.379892 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.379957 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.379972 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.382298 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af
34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.392659 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kh7gd"] Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.393058 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.394055 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.395679 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.395753 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.395853 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.396577 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.409858 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.422762 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.437575 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.453050 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.466659 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.477721 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.484112 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.484168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.484182 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.484201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.484215 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.491190 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.504230 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.516224 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.530408 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.546273 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.548778 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.548960 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rzjx\" (UniqueName: \"kubernetes.io/projected/29b15186-a725-4067-ba76-336f580327fa-kube-api-access-7rzjx\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549004 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-07 19:15:36.548938389 +0000 UTC m=+41.127928424 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549082 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549123 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549165 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549185 4815 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549203 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29b15186-a725-4067-ba76-336f580327fa-host\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549231 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:36.549220846 +0000 UTC m=+41.128210891 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549260 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29b15186-a725-4067-ba76-336f580327fa-serviceca\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.549306 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549360 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549404 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549408 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549421 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549438 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:36.549431822 +0000 UTC m=+41.128421867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549373 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549467 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549476 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:36.549456772 +0000 UTC m=+41.128446907 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549480 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.549517 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:36.549511024 +0000 UTC m=+41.128501069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.563165 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scri
pt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.583804 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.586547 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.586582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.586593 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.586608 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.586618 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.596294 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.650112 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29b15186-a725-4067-ba76-336f580327fa-host\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.650146 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/29b15186-a725-4067-ba76-336f580327fa-serviceca\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.650175 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rzjx\" (UniqueName: \"kubernetes.io/projected/29b15186-a725-4067-ba76-336f580327fa-kube-api-access-7rzjx\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.650250 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29b15186-a725-4067-ba76-336f580327fa-host\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.651057 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29b15186-a725-4067-ba76-336f580327fa-serviceca\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.672344 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rzjx\" (UniqueName: \"kubernetes.io/projected/29b15186-a725-4067-ba76-336f580327fa-kube-api-access-7rzjx\") pod \"node-ca-kh7gd\" (UID: \"29b15186-a725-4067-ba76-336f580327fa\") " pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.689049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.689091 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.689101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.689117 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.689127 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.704817 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kh7gd" Dec 07 19:15:28 crc kubenswrapper[4815]: W1207 19:15:28.720416 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b15186_a725_4067_ba76_336f580327fa.slice/crio-489e2fdd991b249a6c0e9ee4d81e561b20dd54f9ee2360c16f3284999028fef7 WatchSource:0}: Error finding container 489e2fdd991b249a6c0e9ee4d81e561b20dd54f9ee2360c16f3284999028fef7: Status 404 returned error can't find the container with id 489e2fdd991b249a6c0e9ee4d81e561b20dd54f9ee2360c16f3284999028fef7 Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.769174 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:28 crc kubenswrapper[4815]: E1207 19:15:28.769315 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.791528 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.791555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.791565 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.791579 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.791588 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.895943 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.895978 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.895990 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.896007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.896019 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.998719 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.998753 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.998765 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.998779 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:28 crc kubenswrapper[4815]: I1207 19:15:28.998790 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:28Z","lastTransitionTime":"2025-12-07T19:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.007820 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.007856 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.007864 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.007872 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.009722 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f" exitCode=0 Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.009768 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.011571 4815 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/node-ca-kh7gd" event={"ID":"29b15186-a725-4067-ba76-336f580327fa","Type":"ContainerStarted","Data":"489e2fdd991b249a6c0e9ee4d81e561b20dd54f9ee2360c16f3284999028fef7"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.027472 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.043814 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 
19:15:29.058025 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.069757 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.088025 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.105724 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.105750 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.105759 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.105771 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.105780 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.107985 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.128418 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.142410 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.160162 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.170081 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.185121 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.195489 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.208340 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.208530 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.208614 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 
19:15:29.208693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.208756 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.212506 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.224344 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.311653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.312003 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.312013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc 
kubenswrapper[4815]: I1207 19:15:29.312029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.312038 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.413699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.413736 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.413747 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.413761 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.413771 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.516237 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.516276 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.516285 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.516300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.516309 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.619150 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.619482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.619655 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.619837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.620036 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.722904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.722977 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.722993 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.723013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.723034 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.769726 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.769772 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:29 crc kubenswrapper[4815]: E1207 19:15:29.769843 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:29 crc kubenswrapper[4815]: E1207 19:15:29.770006 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.826124 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.826166 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.826178 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.826199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.826212 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.929177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.929235 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.929253 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.929277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:29 crc kubenswrapper[4815]: I1207 19:15:29.929294 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:29Z","lastTransitionTime":"2025-12-07T19:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.020712 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609" exitCode=0 Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.020870 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.023583 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kh7gd" event={"ID":"29b15186-a725-4067-ba76-336f580327fa","Type":"ContainerStarted","Data":"d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031512 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031579 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031779 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031834 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031858 4815 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031886 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.031911 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.050896 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"message\\\":\\\"containers with unready 
status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.072334 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.089036 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.114962 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.134245 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.134299 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.134317 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.134345 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.134362 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.181073 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.199905 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.218548 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.231844 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.237334 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 
19:15:30.237362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.237373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.237387 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.237398 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.249574 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.265895 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.280155 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.294817 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.310011 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.324814 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.337334 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.339426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.339480 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.339492 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.339511 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.339523 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.355531 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z 
is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.375244 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.388787 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.401144 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.420258 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.429209 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.438456 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.441798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.441834 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.441845 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.441860 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.441871 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.456010 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.469574 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.483867 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.496005 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.513309 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.528461 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.544130 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 
19:15:30.544195 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.544218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.544250 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.544341 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.646814 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.646866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.646883 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.646905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.646946 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.751040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.751099 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.751117 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.751146 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.751162 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.769416 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:30 crc kubenswrapper[4815]: E1207 19:15:30.769608 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.854365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.854437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.854456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.854483 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.854501 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.956693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.957060 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.957226 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.957354 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:30 crc kubenswrapper[4815]: I1207 19:15:30.957484 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:30Z","lastTransitionTime":"2025-12-07T19:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.041224 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44" exitCode=0 Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.041302 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.060236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.060277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.060295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.060320 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.060339 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.061180 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.083315 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.102078 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.115034 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.134290 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.151338 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.163537 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.163598 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.163618 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 
19:15:31.163643 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.163661 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.164996 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.183950 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.201123 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.219256 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.238552 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.253211 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.266456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.266504 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.266518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.266537 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.266550 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.277292 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.287977 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.369224 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.369269 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.369279 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 
19:15:31.369295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.369305 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.495950 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.496017 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.496035 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.496063 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.496080 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.598878 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.599177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.599190 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.599208 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.599221 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.701831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.701874 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.701885 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.701901 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.701928 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.769199 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.769292 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:31 crc kubenswrapper[4815]: E1207 19:15:31.769394 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:31 crc kubenswrapper[4815]: E1207 19:15:31.769505 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.804757 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.804796 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.804807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.804824 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.804837 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.906645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.906684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.906696 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.906714 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:31 crc kubenswrapper[4815]: I1207 19:15:31.906729 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:31Z","lastTransitionTime":"2025-12-07T19:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.009879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.009969 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.009988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.010016 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.010040 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.048512 4815 generic.go:334] "Generic (PLEG): container finished" podID="3fefaece-ab52-48e2-9ee9-fb07be1922f1" containerID="825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529" exitCode=0 Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.048616 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerDied","Data":"825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.056376 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.073885 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.088455 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.102393 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.112767 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.112874 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.112897 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.112953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.112974 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.123363 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.135671 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.151493 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.163333 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.173865 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.199528 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.215177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.215213 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.215224 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.215241 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.215253 4815 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.221541 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.238839 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.254813 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.271324 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.293500 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:32Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.318249 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.318323 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.318342 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.318370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.318388 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.426034 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.426101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.426120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.426148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.426171 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.529401 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.529445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.529457 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.529476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.529490 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.633567 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.633638 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.633657 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.633681 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.633700 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.736839 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.736881 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.736896 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.736944 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.736959 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.769355 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:32 crc kubenswrapper[4815]: E1207 19:15:32.769598 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.840014 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.840083 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.840106 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.840134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.840156 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.947013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.947073 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.947091 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.947119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:32 crc kubenswrapper[4815]: I1207 19:15:32.947141 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:32Z","lastTransitionTime":"2025-12-07T19:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.050408 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.050473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.050490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.050515 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.050532 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.066097 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" event={"ID":"3fefaece-ab52-48e2-9ee9-fb07be1922f1","Type":"ContainerStarted","Data":"600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.089192 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.112718 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7
ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\
\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.133385 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.151212 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.153762 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.153830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.153857 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.153888 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.153911 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.181748 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.197760 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.216410 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.241070 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.256632 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.256667 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.256678 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.256694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.256707 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.264403 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.282791 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.300664 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.320090 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.339491 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.359494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 
19:15:33.359582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.359666 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\
\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:33Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.359738 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc 
kubenswrapper[4815]: I1207 19:15:33.359806 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.359826 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.462408 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.462473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.462492 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.462518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.462536 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.565747 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.565815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.565850 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.565883 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.565911 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.669218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.669328 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.669352 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.669380 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.669403 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.769334 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:33 crc kubenswrapper[4815]: E1207 19:15:33.769533 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.769611 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:33 crc kubenswrapper[4815]: E1207 19:15:33.769794 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.771706 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.771765 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.771784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.771808 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.771828 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.874468 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.874512 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.874529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.874552 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.874573 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.977623 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.977720 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.977740 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.977763 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:33 crc kubenswrapper[4815]: I1207 19:15:33.977820 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:33Z","lastTransitionTime":"2025-12-07T19:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.080464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.080515 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.080535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.080561 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.080582 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.183075 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.183115 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.183128 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.183146 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.183160 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.286160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.286207 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.286218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.286236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.286249 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.388284 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.388592 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.388605 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.388623 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.388633 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.492068 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.492111 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.492119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.492137 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.492146 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.594948 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.595009 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.595027 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.595050 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.595080 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.698754 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.698811 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.698828 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.698855 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.698872 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.768812 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:34 crc kubenswrapper[4815]: E1207 19:15:34.768978 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.801933 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.801968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.801981 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.801999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.802010 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.904488 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.904861 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.905050 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.905188 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:34 crc kubenswrapper[4815]: I1207 19:15:34.905313 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:34Z","lastTransitionTime":"2025-12-07T19:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.008461 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.009059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.009077 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.009099 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.009111 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.080553 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.081076 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.081132 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.081157 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.102630 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.112147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc 
kubenswrapper[4815]: I1207 19:15:35.112196 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.112214 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.112241 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.112257 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.124503 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.124845 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.136034 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.167584 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.185293 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.205291 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.214830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.214895 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.214912 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.214960 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.214977 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.224016 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.244907 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.267680 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.289591 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.310091 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.317359 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.317447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.317466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.317489 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.317506 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.330954 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.349653 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.374533 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.391523 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.408048 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.420029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.420107 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.420127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.420701 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.420967 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.432268 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z 
is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.455848 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.471715 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b0846
52d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.483666 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.503823 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.516186 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.527238 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.527353 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.527379 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.527413 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 
19:15:35.527438 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.538560 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.555855 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.571534 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.590330 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.611668 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.627771 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.630736 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.630799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.630826 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.630982 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.631038 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.646857 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.735785 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.735832 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.735848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.735872 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.735889 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.774069 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:35 crc kubenswrapper[4815]: E1207 19:15:35.774304 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.774751 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:35 crc kubenswrapper[4815]: E1207 19:15:35.777894 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.808494 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.840167 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.840256 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.840272 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.840288 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.840357 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.849490 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.873365 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.895868 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.932838 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.942641 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.942665 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.942680 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.942692 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.942702 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:35Z","lastTransitionTime":"2025-12-07T19:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.956675 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.971874 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:35 crc kubenswrapper[4815]: I1207 19:15:35.991811 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.006528 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.023072 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.039227 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.044755 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.044784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.044793 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.044807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.044815 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.055136 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.065535 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.077308 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.146484 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.146512 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.146522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.146537 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.146546 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.248422 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.248454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.248464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.248482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.248499 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.352709 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.352775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.352798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.352829 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.352847 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.455732 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.455782 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.455794 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.455810 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.455821 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.559435 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.559493 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.559511 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.559535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.559553 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.647804 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.647956 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.648007 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.648049 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648162 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-07 19:15:52.648114217 +0000 UTC m=+57.227104302 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648197 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648213 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.648264 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648227 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648353 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-12-07 19:15:52.648315742 +0000 UTC m=+57.227305857 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648368 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648435 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:52.648412225 +0000 UTC m=+57.227402310 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648440 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.648516 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:52.648501327 +0000 UTC m=+57.227491412 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.649010 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.649060 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.649081 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.649186 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:52.649157975 +0000 UTC m=+57.228148060 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.662968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.663079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.663101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.663126 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.663144 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.766533 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.766627 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.767118 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.767274 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.767297 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.769809 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:36 crc kubenswrapper[4815]: E1207 19:15:36.769991 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.894162 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.894251 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.894270 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.894295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.894314 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.997161 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.997238 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.997261 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.997292 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:36 crc kubenswrapper[4815]: I1207 19:15:36.997316 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:36Z","lastTransitionTime":"2025-12-07T19:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.101877 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.102349 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.102373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.102405 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.102424 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.110865 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.110911 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.110965 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.110991 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.111012 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.132024 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:37Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.137545 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.137599 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.137624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.137653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.137674 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.163586 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.163637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.163654 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.163679 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.163697 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.193235 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.193550 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.193757 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.194105 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.194249 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.217131 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:37Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.222694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.222912 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.223065 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.223220 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.223354 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.249831 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:37Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.250249 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.252337 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.252419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.252439 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.252464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.252483 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.355848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.355941 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.355966 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.355997 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.356017 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.459288 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.459619 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.459789 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.460061 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.460249 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.563300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.563350 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.563363 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.563381 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.563415 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.667056 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.667128 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.667147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.667170 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.667188 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.769382 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.769608 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.769640 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:37 crc kubenswrapper[4815]: E1207 19:15:37.769831 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.770476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.770533 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.770551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.770572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.770590 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.873283 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.873346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.873370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.873400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.873423 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.977019 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.977094 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.977118 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.977148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:37 crc kubenswrapper[4815]: I1207 19:15:37.977171 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:37Z","lastTransitionTime":"2025-12-07T19:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.080233 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.080323 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.080346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.080378 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.080400 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.099520 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/0.log" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.104611 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9" exitCode=1 Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.104668 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.106123 4815 scope.go:117] "RemoveContainer" containerID="69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.159355 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.186086 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc 
kubenswrapper[4815]: I1207 19:15:38.186154 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.186171 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.186201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.186220 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.186227 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.206765 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.225169 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.257789 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b
19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.272827 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.289641 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.289687 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.289704 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.289729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.289747 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.295053 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.314847 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.333253 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.355817 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.377970 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.390863 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.392056 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.392097 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.392130 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.392153 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.392170 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.403973 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.412974 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:38Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.495111 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.495178 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.495190 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc 
kubenswrapper[4815]: I1207 19:15:38.495210 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.495222 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.612238 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.612289 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.612304 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.612327 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.612348 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.715889 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.715963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.715980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.716004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.716023 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.769891 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:38 crc kubenswrapper[4815]: E1207 19:15:38.770122 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.818083 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.818139 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.818160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.818186 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.818205 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.921572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.921682 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.921699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.921722 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:38 crc kubenswrapper[4815]: I1207 19:15:38.921740 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:38Z","lastTransitionTime":"2025-12-07T19:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.024200 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.024277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.024301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.024331 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.024357 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.127196 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.128172 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.128694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.128840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.129010 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.232490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.232544 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.232562 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.232584 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.232604 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.335776 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.335835 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.335853 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.335879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.335900 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.440108 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.440171 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.440190 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.440216 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.440234 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.504466 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds"] Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.504953 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.506496 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.507568 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.521317 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.537218 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413
bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f
5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\
",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.542788 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.542831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.542847 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.542870 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.542886 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.550896 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.576212 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.588964 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.589036 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.589067 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzfhb\" (UniqueName: \"kubernetes.io/projected/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-kube-api-access-rzfhb\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.589102 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.606766 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.641667 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b
19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.645339 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.645379 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.645390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.645404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.645415 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.656932 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.677429 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.690365 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.690410 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.690538 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzfhb\" (UniqueName: \"kubernetes.io/projected/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-kube-api-access-rzfhb\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 
crc kubenswrapper[4815]: I1207 19:15:39.690573 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.691137 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-env-overrides\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.691256 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.702588 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.709001 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.717837 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzfhb\" (UniqueName: \"kubernetes.io/projected/5dfd265a-0c7f-40bd-9226-82c2b1abbeda-kube-api-access-rzfhb\") pod \"ovnkube-control-plane-749d76644c-tqxds\" (UID: \"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.723215 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.734497 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748197 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748228 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748239 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748254 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748266 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.748767 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.764153 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.769127 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.769209 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:39 crc kubenswrapper[4815]: E1207 19:15:39.769335 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:39 crc kubenswrapper[4815]: E1207 19:15:39.769660 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.772448 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.781247 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.824566 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" Dec 07 19:15:39 crc kubenswrapper[4815]: W1207 19:15:39.838767 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfd265a_0c7f_40bd_9226_82c2b1abbeda.slice/crio-010809facfee7d9033ef2db09bc39371c6cc31ce60fab8904410f4fe3511b204 WatchSource:0}: Error finding container 010809facfee7d9033ef2db09bc39371c6cc31ce60fab8904410f4fe3511b204: Status 404 returned error can't find the container with id 010809facfee7d9033ef2db09bc39371c6cc31ce60fab8904410f4fe3511b204 Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.850246 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.850274 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.850282 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.850294 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.850303 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.952543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.952600 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.952620 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.952651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:39 crc kubenswrapper[4815]: I1207 19:15:39.952667 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:39Z","lastTransitionTime":"2025-12-07T19:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.054572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.054609 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.054617 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.054631 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.054640 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.114824 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/0.log" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.118158 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.119040 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.119150 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" event={"ID":"5dfd265a-0c7f-40bd-9226-82c2b1abbeda","Type":"ContainerStarted","Data":"010809facfee7d9033ef2db09bc39371c6cc31ce60fab8904410f4fe3511b204"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.132151 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.145584 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72
f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:
15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156351 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156847 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156906 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156960 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.156996 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.171985 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af
34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.182217 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.199775 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.210509 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.222747 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07
T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.234792 4815 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.246000 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.259195 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.259229 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.259240 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.259259 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.259270 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.260588 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.273107 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.289005 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.303314 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.317277 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.362194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.362275 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.362290 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc 
kubenswrapper[4815]: I1207 19:15:40.362307 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.362349 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.432835 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.446858 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.458380 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.464820 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc 
kubenswrapper[4815]: I1207 19:15:40.464840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.464848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.464860 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.464868 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.480987 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.505373 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.524630 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.540300 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.564798 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.567662 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.567699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.567710 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.567726 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.567738 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.574650 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.588958 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.602771 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.615090 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.627011 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.637601 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.652332 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.663589 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.670301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 
19:15:40.670338 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.670350 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.670384 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.670395 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.675076 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xbq22"] Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.675575 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: E1207 19:15:40.675638 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.687541 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.703730 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.713396 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.725673 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.736873 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.757427 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.765127 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.769645 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:40 crc kubenswrapper[4815]: E1207 19:15:40.769739 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.772139 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.772160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.772168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.772178 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.772187 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.774952 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.786210 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.793485 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.801717 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.801754 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z24bd\" (UniqueName: \"kubernetes.io/projected/201e9ba8-3e19-4555-90f0-587497a2a328-kube-api-access-z24bd\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.806714 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.816045 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.826339 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.834767 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.843342 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.851710 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:40Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:40 crc 
kubenswrapper[4815]: I1207 19:15:40.875065 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.875108 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.875125 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.875149 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.875166 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.903006 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z24bd\" (UniqueName: \"kubernetes.io/projected/201e9ba8-3e19-4555-90f0-587497a2a328-kube-api-access-z24bd\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.903150 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: E1207 19:15:40.903336 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:40 crc kubenswrapper[4815]: E1207 19:15:40.903411 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:41.4033872 +0000 UTC m=+45.982377285 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.917267 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z24bd\" (UniqueName: \"kubernetes.io/projected/201e9ba8-3e19-4555-90f0-587497a2a328-kube-api-access-z24bd\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.978218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.978303 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.978326 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.978357 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:40 crc kubenswrapper[4815]: I1207 19:15:40.978379 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:40Z","lastTransitionTime":"2025-12-07T19:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.080886 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.080992 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.081009 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.081036 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.081053 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.126059 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/1.log" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.127337 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/0.log" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.134115 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac" exitCode=1 Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.134245 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.134479 4815 scope.go:117] "RemoveContainer" containerID="69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.135696 4815 scope.go:117] "RemoveContainer" containerID="c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac" Dec 07 19:15:41 crc kubenswrapper[4815]: E1207 19:15:41.136056 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.145332 4815 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" event={"ID":"5dfd265a-0c7f-40bd-9226-82c2b1abbeda","Type":"ContainerStarted","Data":"a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.145413 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" event={"ID":"5dfd265a-0c7f-40bd-9226-82c2b1abbeda","Type":"ContainerStarted","Data":"3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.164095 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\
\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.183810 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.184719 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.185148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.185314 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.185648 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.186012 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.210735 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.225610 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.242604 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.262171 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.290298 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.290357 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.290375 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.290400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.290417 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.291100 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.309543 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.328717 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.342760 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.356291 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.368686 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.391708 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.393224 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.393261 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.393274 4815 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.393296 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.393309 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.406603 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.408681 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:41 crc kubenswrapper[4815]: E1207 19:15:41.408971 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:41 crc kubenswrapper[4815]: E1207 19:15:41.409093 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:42.409060154 +0000 UTC m=+46.988050249 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.423091 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a932
7fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\
" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.435819 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc 
kubenswrapper[4815]: I1207 19:15:41.457536 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc 
kubenswrapper[4815]: I1207 19:15:41.474410 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a
3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.486250 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.495613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.495657 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.495670 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.495688 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.495700 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.510304 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af
34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.522515 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.539405 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://69f7f34258562f08a32090f2be4e879a9646f9adf946e66561c4e127eac3ffc9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:37Z\\\",\\\"message\\\":\\\"er Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-kube-controller-manager/kube-controller-manager]} name:Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.36:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:36.490000 6038 services_controller.go:452] Built service openshift-machine-api/machine-api-controllers per-node LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490018 6038 services_controller.go:453] Built service openshift-machine-api/machine-api-controllers template LB for network=default: []services.LB{}\\\\nI1207 19:15:36.490032 6038 services_controller.go:454] Service openshift-machine-api/machine-api-controllers for network=default has 3 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF1207 19:15:36.490040 6038 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.550310 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.560281 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.572789 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.586471 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.598475 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.598518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.598530 4815 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.598549 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.598560 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.602869 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.614385 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.626052 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.636781 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.648003 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.657056 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:41Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:41 crc 
kubenswrapper[4815]: I1207 19:15:41.701513 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.701548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.701557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.701572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.701581 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.769246 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:41 crc kubenswrapper[4815]: E1207 19:15:41.769370 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.769553 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:41 crc kubenswrapper[4815]: E1207 19:15:41.769806 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.803767 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.803819 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.803830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.803845 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.803857 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.905581 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.905605 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.905613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.905625 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:41 crc kubenswrapper[4815]: I1207 19:15:41.905633 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:41Z","lastTransitionTime":"2025-12-07T19:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.009500 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.009533 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.009541 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.009555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.009563 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.112221 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.112598 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.112775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.113059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.113255 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.150579 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/1.log" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.156226 4815 scope.go:117] "RemoveContainer" containerID="c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac" Dec 07 19:15:42 crc kubenswrapper[4815]: E1207 19:15:42.156443 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.171323 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.185374 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc 
kubenswrapper[4815]: I1207 19:15:42.205133 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc 
kubenswrapper[4815]: I1207 19:15:42.215422 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.215629 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.215751 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.215889 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.216045 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.233174 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.250263 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.265884 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.281137 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.298900 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.311713 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.318968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.319037 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.319049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.319065 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.319076 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.327460 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.345308 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.363726 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.376676 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.391466 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.410131 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421264 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:42Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421710 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421730 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421754 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.421772 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.425165 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:42 crc kubenswrapper[4815]: E1207 19:15:42.425445 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:42 crc kubenswrapper[4815]: E1207 19:15:42.425565 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:44.42553041 +0000 UTC m=+49.004520495 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.524814 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.524866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.524888 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.524988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.525007 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.627557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.627627 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.627650 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.627679 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.627702 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.731327 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.731394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.731417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.731445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.731468 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.769725 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:42 crc kubenswrapper[4815]: E1207 19:15:42.769993 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.770185 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:42 crc kubenswrapper[4815]: E1207 19:15:42.770352 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.834727 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.834777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.834795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.834822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.834840 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.938165 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.938227 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.938296 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.938329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:42 crc kubenswrapper[4815]: I1207 19:15:42.938350 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:42Z","lastTransitionTime":"2025-12-07T19:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.041625 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.041684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.041722 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.041753 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.041776 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.147322 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.147377 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.147398 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.147426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.147446 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.210517 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.220121 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.231092 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.245144 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc 
kubenswrapper[4815]: I1207 19:15:43.250751 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.250806 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.250825 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.250851 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.250874 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.263590 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z 
is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.286986 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.304631 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.322625 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.339291 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.353566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.353617 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.353635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.353659 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.353677 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.370870 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.386487 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.399678 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.419573 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.437397 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.457081 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.457480 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.457503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 
19:15:43.457535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.457558 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.459580 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.480055 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.503468 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.524946 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:43Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.561444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 
19:15:43.561505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.561524 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.561551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.561570 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.665061 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.665131 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.665150 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.665181 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.665202 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768683 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768734 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768752 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768777 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768798 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.768850 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:43 crc kubenswrapper[4815]: E1207 19:15:43.768969 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:43 crc kubenswrapper[4815]: E1207 19:15:43.769065 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.872464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.872525 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.872543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.872567 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.872585 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.976168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.976231 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.976251 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.976278 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:43 crc kubenswrapper[4815]: I1207 19:15:43.976296 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:43Z","lastTransitionTime":"2025-12-07T19:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.079502 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.079561 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.079583 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.079611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.079633 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.182396 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.182457 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.182476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.182501 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.182519 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.285265 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.285331 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.285353 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.285381 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.285398 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.388509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.388636 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.388663 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.388737 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.388762 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.446612 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:44 crc kubenswrapper[4815]: E1207 19:15:44.446872 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:44 crc kubenswrapper[4815]: E1207 19:15:44.447111 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:48.447005996 +0000 UTC m=+53.025996081 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.492407 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.492460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.492476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.492501 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.492520 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.596227 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.596296 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.596314 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.596339 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.596357 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.699594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.699647 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.699665 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.699687 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.699703 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.769609 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.769642 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:44 crc kubenswrapper[4815]: E1207 19:15:44.769819 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:44 crc kubenswrapper[4815]: E1207 19:15:44.769994 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.805228 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.805274 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.805305 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.805326 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.805337 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.907958 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.908013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.908029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.908047 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:44 crc kubenswrapper[4815]: I1207 19:15:44.908059 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:44Z","lastTransitionTime":"2025-12-07T19:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.010700 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.010742 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.010756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.010775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.010788 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.114256 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.114315 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.114333 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.114361 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.114379 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.216656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.216751 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.216769 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.216795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.216813 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.320028 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.320104 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.320119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.320137 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.320150 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.423318 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.423430 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.423447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.423470 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.423486 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.526489 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.526553 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.526571 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.526597 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.526616 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.629043 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.629106 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.629127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.629157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.629179 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.732515 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.732606 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.732635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.732669 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.732696 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.768952 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.769052 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:45 crc kubenswrapper[4815]: E1207 19:15:45.769148 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:45 crc kubenswrapper[4815]: E1207 19:15:45.769269 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.791175 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2
42b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.810501 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc 
kubenswrapper[4815]: I1207 19:15:45.833532 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc 
kubenswrapper[4815]: I1207 19:15:45.835411 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.835499 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.835519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.835546 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.835564 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.856457 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.874297 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.891848 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.909648 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.931195 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.937879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.938059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.938079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.938134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.938157 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:45Z","lastTransitionTime":"2025-12-07T19:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.947858 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.970587 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:45 crc kubenswrapper[4815]: I1207 19:15:45.992579 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:45Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.011204 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.029736 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.042665 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.042729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.042751 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.042781 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.042800 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.054621 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.077531 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.099484 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.117205 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:46Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.145836 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.145891 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.145908 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.145967 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.145985 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.248729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.248782 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.248798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.248828 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.248846 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.352037 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.352103 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.352121 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.352146 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.352167 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.454458 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.454548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.454570 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.454597 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.454619 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.557537 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.557588 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.557607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.557631 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.557647 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.661215 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.661274 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.661300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.661329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.661354 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.764710 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.764763 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.764781 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.764804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.764822 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.769488 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.769686 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:46 crc kubenswrapper[4815]: E1207 19:15:46.769722 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:46 crc kubenswrapper[4815]: E1207 19:15:46.770173 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.868519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.868585 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.868603 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.868628 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.868647 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.971401 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.971840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.972021 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.972164 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:46 crc kubenswrapper[4815]: I1207 19:15:46.972301 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:46Z","lastTransitionTime":"2025-12-07T19:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.075295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.075412 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.075438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.075463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.075481 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.178225 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.178273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.178295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.178321 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.178342 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.282170 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.282310 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.282332 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.282356 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.282374 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.386373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.386442 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.386465 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.386494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.386514 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.436636 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.436695 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.436720 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.436748 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.436772 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.459001 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:47Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.464299 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.464356 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.464379 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.464406 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.464428 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.485180 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:47Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.491172 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.491238 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.491261 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.491289 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.491311 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.513407 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:47Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.519450 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.519506 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.519531 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.519562 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.519586 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.541293 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:47Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.545944 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.545983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.545998 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.546019 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.546033 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.565057 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:47Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.565298 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.567521 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.567559 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.567572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.567591 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.567605 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.671354 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.671434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.671457 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.671482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.671499 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.769601 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.769703 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.769831 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:47 crc kubenswrapper[4815]: E1207 19:15:47.769912 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.775182 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.775230 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.775247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.775277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.775295 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.878011 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.878069 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.878089 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.878139 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.878161 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.981190 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.981238 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.981252 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.981273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:47 crc kubenswrapper[4815]: I1207 19:15:47.981288 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:47Z","lastTransitionTime":"2025-12-07T19:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.084741 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.084812 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.084837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.084866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.084887 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.187353 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.187419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.187441 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.187469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.187490 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.297129 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.297232 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.297258 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.297651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.297674 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.400431 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.400486 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.400510 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.400540 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.400564 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.489255 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:48 crc kubenswrapper[4815]: E1207 19:15:48.489442 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:48 crc kubenswrapper[4815]: E1207 19:15:48.489493 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:15:56.489477243 +0000 UTC m=+61.068467298 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.503827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.503944 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.503963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.503992 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.504013 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.606452 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.606490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.606501 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.606517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.606527 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.708464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.708520 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.708543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.708571 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.708591 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.769986 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:48 crc kubenswrapper[4815]: E1207 19:15:48.770258 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.770389 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:48 crc kubenswrapper[4815]: E1207 19:15:48.770569 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.811892 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.811994 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.812013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.812037 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.812054 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.915323 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.915360 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.915372 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.915388 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:48 crc kubenswrapper[4815]: I1207 19:15:48.915399 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:48Z","lastTransitionTime":"2025-12-07T19:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.018116 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.018158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.018215 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.018236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.018251 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.121745 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.121801 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.121819 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.121842 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.121860 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.224460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.224517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.224535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.224557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.224574 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.327904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.328007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.328034 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.328067 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.328090 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.430387 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.430435 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.430450 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.430469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.430484 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.533240 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.533298 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.533322 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.533346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.533362 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.635713 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.635804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.635822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.635848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.635868 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.738050 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.738090 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.738102 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.738119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.738131 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.769001 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:49 crc kubenswrapper[4815]: E1207 19:15:49.769114 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.769171 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:49 crc kubenswrapper[4815]: E1207 19:15:49.769317 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.840095 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.840149 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.840167 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.840191 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.840210 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.942548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.942607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.942630 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.942656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:49 crc kubenswrapper[4815]: I1207 19:15:49.942674 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:49Z","lastTransitionTime":"2025-12-07T19:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.044686 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.044928 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.045021 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.045114 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.045214 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.148237 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.148285 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.148300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.148320 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.148333 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.250399 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.250448 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.250464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.250484 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.250500 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.353409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.353438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.353449 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.353463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.353475 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.455826 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.455873 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.455887 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.455905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.455944 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.558104 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.558165 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.558194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.558224 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.558246 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.660971 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.661016 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.661030 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.661050 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.661064 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.763639 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.763707 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.763731 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.763764 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.763787 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.768779 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.768804 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:50 crc kubenswrapper[4815]: E1207 19:15:50.768940 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:50 crc kubenswrapper[4815]: E1207 19:15:50.769014 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.867287 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.867316 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.867327 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.867345 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.867356 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.970218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.970551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.970680 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.970822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:50 crc kubenswrapper[4815]: I1207 19:15:50.970978 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:50Z","lastTransitionTime":"2025-12-07T19:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.073495 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.073904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.074076 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.074221 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.074418 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.177870 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.177945 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.177962 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.177988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.178008 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.280303 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.280332 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.280340 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.280353 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.280362 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.383134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.383191 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.383209 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.383233 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.383250 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.486389 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.486426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.486435 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.486448 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.486456 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.589954 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.589994 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.590023 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.590038 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.590050 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.693076 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.693122 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.693140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.693164 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.693181 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.769205 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:51 crc kubenswrapper[4815]: E1207 19:15:51.769420 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.770042 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:51 crc kubenswrapper[4815]: E1207 19:15:51.770170 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.796111 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.796170 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.796194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.796223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.796247 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.899342 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.899393 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.899415 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.899444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:51 crc kubenswrapper[4815]: I1207 19:15:51.899469 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:51Z","lastTransitionTime":"2025-12-07T19:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.002314 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.002370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.002396 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.002411 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.002419 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.105469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.105529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.105549 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.105571 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.105589 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.208017 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.208075 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.208096 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.208124 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.208144 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.311295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.311345 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.311367 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.311395 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.311415 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.413772 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.413821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.413834 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.413852 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.413867 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.516624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.516675 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.516688 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.516708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.516719 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.618843 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.618880 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.618894 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.618910 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.618953 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.720865 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.720955 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.720977 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.720996 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.721010 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.735461 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735571 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-07 19:16:24.735549929 +0000 UTC m=+89.314539994 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.735607 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.735664 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.735692 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.735739 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735779 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735844 4815 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735860 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735847 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:24.735826457 +0000 UTC m=+89.314816512 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735886 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735905 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735779 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735982 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735997 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.735906 4815 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:24.735892218 +0000 UTC m=+89.314882273 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.736056 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:24.736040702 +0000 UTC m=+89.315030757 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.736083 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:24.736072973 +0000 UTC m=+89.315063028 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.769521 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.769537 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.769751 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:52 crc kubenswrapper[4815]: E1207 19:15:52.769829 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.824679 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.824769 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.824804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.824832 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.824854 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.927200 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.927248 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.927259 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.927283 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:52 crc kubenswrapper[4815]: I1207 19:15:52.927296 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:52Z","lastTransitionTime":"2025-12-07T19:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.029982 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.030022 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.030033 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.030049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.030062 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.132595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.132627 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.132638 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.132653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.132666 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.235373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.235471 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.235491 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.235517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.235537 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.338140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.338223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.338245 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.338273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.338292 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.445574 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.445626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.445645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.445669 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.445687 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.547425 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.547473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.547489 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.547513 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.547530 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.650107 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.650177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.650198 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.650225 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.650245 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.752897 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.753005 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.753030 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.753060 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.753083 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.769548 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:53 crc kubenswrapper[4815]: E1207 19:15:53.769780 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.769972 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:53 crc kubenswrapper[4815]: E1207 19:15:53.770135 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.856122 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.856165 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.856188 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.856303 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.856323 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.958986 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.959036 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.959049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.959068 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:53 crc kubenswrapper[4815]: I1207 19:15:53.959083 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:53Z","lastTransitionTime":"2025-12-07T19:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.061059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.061119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.061129 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.061144 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.061154 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.163394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.163434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.163445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.163464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.163475 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.265455 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.265490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.265503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.265519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.265532 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.367717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.367762 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.367779 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.367799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.367817 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.469577 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.469604 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.469612 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.469626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.469635 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.572355 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.572390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.572401 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.572416 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.572428 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.675393 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.675437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.675447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.675463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.675475 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.769546 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.769594 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:54 crc kubenswrapper[4815]: E1207 19:15:54.769713 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:54 crc kubenswrapper[4815]: E1207 19:15:54.769837 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.777822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.777870 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.777883 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.777898 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.777909 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.880522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.880563 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.880575 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.880594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.880607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.983157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.983198 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.983210 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.983227 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:54 crc kubenswrapper[4815]: I1207 19:15:54.983239 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:54Z","lastTransitionTime":"2025-12-07T19:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.085613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.085650 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.085661 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.085680 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.085693 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.188029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.188082 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.188098 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.188118 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.188133 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.290791 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.290849 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.290867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.290893 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.290910 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.393820 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.393881 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.393902 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.393953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.393972 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.496626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.496669 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.496694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.496713 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.496727 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.598952 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.599034 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.599058 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.599091 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.599115 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.701308 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.701557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.701624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.701691 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.701754 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.768886 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.769083 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:55 crc kubenswrapper[4815]: E1207 19:15:55.771022 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:55 crc kubenswrapper[4815]: E1207 19:15:55.771276 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.786124 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb914
4e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.803058 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281
bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.804264 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.804307 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.804318 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.804338 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.804354 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.819441 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.836774 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.850628 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.870821 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.884982 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.900323 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.907708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.907795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.907813 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.907868 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.908004 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:55Z","lastTransitionTime":"2025-12-07T19:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.912638 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.929978 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.946974 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.964295 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:55 crc kubenswrapper[4815]: I1207 19:15:55.991660 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:55Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.006101 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:56Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.010098 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.010293 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.010432 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.010601 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.010754 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.019768 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:56Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.033228 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:56Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.046886 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:56Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:56 crc 
kubenswrapper[4815]: I1207 19:15:56.113784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.114153 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.114302 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.114456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.114649 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.217498 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.217570 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.217587 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.218071 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.218103 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.321394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.321436 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.321447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.321463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.321473 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.425607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.425664 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.425675 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.425695 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.426133 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.530043 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.530098 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.530109 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.530130 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.530142 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.577674 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:56 crc kubenswrapper[4815]: E1207 19:15:56.578223 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:56 crc kubenswrapper[4815]: E1207 19:15:56.578552 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:12.578524749 +0000 UTC m=+77.157514834 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.633101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.634110 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.635368 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.635410 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.635432 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.739846 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.740281 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.740304 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.740359 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.740382 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.769059 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:56 crc kubenswrapper[4815]: E1207 19:15:56.769418 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.769720 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:56 crc kubenswrapper[4815]: E1207 19:15:56.769956 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.843951 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.843996 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.844011 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.844031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.844046 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.946574 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.946652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.946668 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.946710 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:56 crc kubenswrapper[4815]: I1207 19:15:56.946745 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:56Z","lastTransitionTime":"2025-12-07T19:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.050231 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.050298 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.050312 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.050340 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.050355 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.153302 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.153709 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.153867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.154088 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.154249 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.256250 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.256290 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.256301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.256317 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.256327 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.359526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.359587 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.359615 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.359648 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.359671 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.462496 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.462566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.462586 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.462615 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.462634 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.565004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.565040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.565051 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.565067 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.565078 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.674523 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.674587 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.674606 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.674630 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.674648 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.773293 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:57 crc kubenswrapper[4815]: E1207 19:15:57.773459 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.774533 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:57 crc kubenswrapper[4815]: E1207 19:15:57.774739 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.774879 4815 scope.go:117] "RemoveContainer" containerID="c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.778478 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.778509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.778522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.778538 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.778550 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.881466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.882079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.882253 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.882400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.882727 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.913318 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.913625 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.913717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.915275 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.915388 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: E1207 19:15:57.930090 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:57Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.941576 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.941642 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.941655 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.941696 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.941710 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: E1207 19:15:57.961451 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:57Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.966791 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.966878 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.966899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.966966 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.966980 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:57 crc kubenswrapper[4815]: E1207 19:15:57.992176 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:57Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.998129 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.998188 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.998203 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.998223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:57 crc kubenswrapper[4815]: I1207 19:15:57.998235 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:57Z","lastTransitionTime":"2025-12-07T19:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: E1207 19:15:58.015994 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.020848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.020897 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.020925 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.020950 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.020963 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: E1207 19:15:58.033410 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: E1207 19:15:58.033625 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.035830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.035867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.035882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.035902 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.035930 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.144116 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.144158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.144172 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.144192 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.144208 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.215265 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/1.log" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.217816 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.218388 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246233 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246263 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246277 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.246951 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.264965 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default 
n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.273458 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.285561 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.296703 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.306162 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.322808 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.335578 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348292 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348330 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348342 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348360 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348372 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.348429 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.360657 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.371736 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.386233 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.400304 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc 
kubenswrapper[4815]: I1207 19:15:58.417553 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.435254 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.451192 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc 
kubenswrapper[4815]: I1207 19:15:58.451226 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.451236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.451254 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.451265 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.452135 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.466261 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-07T19:15:58Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.553283 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.553315 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.553325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.553342 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.553353 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.656454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.656508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.656525 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.656548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.656565 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.758568 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.758644 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.758668 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.758698 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.758755 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.769796 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.769851 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:15:58 crc kubenswrapper[4815]: E1207 19:15:58.769932 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:15:58 crc kubenswrapper[4815]: E1207 19:15:58.770034 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.860878 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.860938 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.860950 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.860965 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.860974 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.963910 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.963998 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.964021 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.964053 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:58 crc kubenswrapper[4815]: I1207 19:15:58.964077 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:58Z","lastTransitionTime":"2025-12-07T19:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.067377 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.067426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.067436 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.067452 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.067464 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.170042 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.170083 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.170095 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.170111 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.170122 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.223480 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/2.log" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.224220 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/1.log" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.227059 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8" exitCode=1 Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.227115 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.227164 4815 scope.go:117] "RemoveContainer" containerID="c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.228267 4815 scope.go:117] "RemoveContainer" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8" Dec 07 19:15:59 crc kubenswrapper[4815]: E1207 19:15:59.228545 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.241356 4815 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.254053 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.271816 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.271850 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.271859 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.271871 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.271879 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.274335 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c6dacc25cfd09c798577f8ef8470f6a4d7f1fe33b265ab8f798ea6ae999bd3ac\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"rnetes/ovnkube-control-plane-749d76644c-tqxds\\\\nI1207 19:15:40.404183 6171 services_controller.go:451] Built service openshift-machine-api/machine-api-operator-machine-webhook cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/machine-api-operator-machine-webhook_TCP_cluster\\\\\\\", 
UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator-machine-webhook\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.250\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1207 19:15:40.404208 6171 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 
19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b
19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.285965 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.299425 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.313088 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.325438 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.337529 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.349405 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.361287 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.372557 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.373891 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 
19:15:59.373935 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.373946 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.373963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.373974 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.381277 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.392229 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.401689 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc 
kubenswrapper[4815]: I1207 19:15:59.412144 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc 
kubenswrapper[4815]: I1207 19:15:59.426642 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a
3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.436853 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:15:59Z is after 2025-08-24T17:21:41Z" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.475904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.475956 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.475969 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.475988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.476027 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.578473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.578503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.578531 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.578546 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.578555 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.681611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.681691 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.681714 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.681743 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.681765 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.769633 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:15:59 crc kubenswrapper[4815]: E1207 19:15:59.769787 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.769854 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:15:59 crc kubenswrapper[4815]: E1207 19:15:59.770071 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.784280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.784327 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.784344 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.784365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.784382 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.886566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.886637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.886658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.886688 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.886711 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.988456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.988508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.988526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.988550 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:15:59 crc kubenswrapper[4815]: I1207 19:15:59.988568 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:15:59Z","lastTransitionTime":"2025-12-07T19:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.090871 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.090959 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.090977 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.091000 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.091016 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.193746 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.193799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.193817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.193837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.193851 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.232765 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/2.log" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.236785 4815 scope.go:117] "RemoveContainer" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8" Dec 07 19:16:00 crc kubenswrapper[4815]: E1207 19:16:00.237046 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.248043 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.259087 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.275210 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.284556 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.294781 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.296221 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.296255 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.296263 4815 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.296276 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.296285 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.306393 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 
maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.315178 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.325662 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.335712 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.347677 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.360946 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.368832 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.378672 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.389720 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc 
kubenswrapper[4815]: I1207 19:16:00.398477 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.398508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.398517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.398532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.398543 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.400467 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z 
is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.413968 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.424989 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:00Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.501551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.501599 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.501616 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.501639 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.501656 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.604653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.604686 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.604694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.604711 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.604720 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.707334 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.707382 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.707391 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.707407 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.707418 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.769457 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.769465 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:00 crc kubenswrapper[4815]: E1207 19:16:00.769638 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:00 crc kubenswrapper[4815]: E1207 19:16:00.769703 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.809657 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.809731 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.809753 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.809784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.809809 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.912069 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.912134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.912157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.912187 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:00 crc kubenswrapper[4815]: I1207 19:16:00.912210 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:00Z","lastTransitionTime":"2025-12-07T19:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.014676 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.014702 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.014711 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.014724 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.014732 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.117836 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.117886 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.117905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.117971 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.117988 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.220359 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.220389 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.220397 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.220409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.220434 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.322469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.322503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.322514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.322528 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.322537 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.425313 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.425352 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.425362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.425376 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.425387 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.527854 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.528014 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.528042 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.528073 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.528093 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.630847 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.630971 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.630997 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.631027 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.631047 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.733764 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.733802 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.733814 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.733829 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.733840 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.769127 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:01 crc kubenswrapper[4815]: E1207 19:16:01.769239 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.769130 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:01 crc kubenswrapper[4815]: E1207 19:16:01.769382 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.836680 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.836768 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.836793 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.836818 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.836872 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.939926 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.939977 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.939989 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.940007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:01 crc kubenswrapper[4815]: I1207 19:16:01.940020 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:01Z","lastTransitionTime":"2025-12-07T19:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.042124 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.042161 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.042172 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.042189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.042203 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.144148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.144199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.144218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.144242 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.144258 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.246039 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.246078 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.246090 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.246108 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.246119 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.349097 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.349131 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.349143 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.349158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.349167 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.452723 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.452771 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.452794 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.452821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.452841 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.556791 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.556885 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.557438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.557785 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.557828 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.662607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.662656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.662668 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.662686 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.662698 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.764900 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.764974 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.764984 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.764999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.765011 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.769342 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.769404 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:02 crc kubenswrapper[4815]: E1207 19:16:02.769435 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:02 crc kubenswrapper[4815]: E1207 19:16:02.769596 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.867821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.867859 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.867867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.867881 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.867890 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.970405 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.970436 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.970447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.970463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:02 crc kubenswrapper[4815]: I1207 19:16:02.970474 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:02Z","lastTransitionTime":"2025-12-07T19:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.073459 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.073499 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.073512 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.073529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.073541 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.176404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.176443 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.176455 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.176470 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.176481 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.279187 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.279223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.279232 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.279247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.279256 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.381226 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.381267 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.381276 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.381291 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.381301 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.483564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.483624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.483638 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.483657 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.483670 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.586075 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.586105 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.586114 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.586127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.586135 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.688669 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.688705 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.688714 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.688742 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.688758 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.769635 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.769678 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:03 crc kubenswrapper[4815]: E1207 19:16:03.769799 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:03 crc kubenswrapper[4815]: E1207 19:16:03.769887 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.795189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.795243 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.795257 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.795273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.795307 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.897758 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.897799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.897810 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.897827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:03 crc kubenswrapper[4815]: I1207 19:16:03.897838 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:03Z","lastTransitionTime":"2025-12-07T19:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.000368 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.000457 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.000482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.000513 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.000537 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.102579 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.102653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.102675 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.102704 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.102735 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.205301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.205338 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.205348 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.205362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.205373 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.307872 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.307903 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.307930 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.307943 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.307953 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.410329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.410360 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.410369 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.410382 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.410391 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.512397 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.512461 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.512478 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.512502 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.512530 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.615530 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.615583 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.615599 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.615623 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.615637 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.718199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.718259 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.718277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.718305 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.718324 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.769473 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.769514 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:04 crc kubenswrapper[4815]: E1207 19:16:04.769660 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:04 crc kubenswrapper[4815]: E1207 19:16:04.769802 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.820845 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.820896 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.820908 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.820943 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.820962 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.923205 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.923241 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.923251 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.923264 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:04 crc kubenswrapper[4815]: I1207 19:16:04.923275 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:04Z","lastTransitionTime":"2025-12-07T19:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.026649 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.026681 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.026692 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.026706 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.026715 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.128854 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.128903 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.128947 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.128971 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.128987 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.231279 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.231312 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.231354 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.231372 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.231383 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.333983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.334031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.334049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.334072 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.334088 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.437144 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.437193 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.437203 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.437217 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.437226 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.540084 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.540127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.540138 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.540154 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.540167 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.643541 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.643578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.643594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.643615 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.643632 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.746533 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.746569 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.746578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.746593 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.746603 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.769434 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.769504 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:05 crc kubenswrapper[4815]: E1207 19:16:05.769538 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:05 crc kubenswrapper[4815]: E1207 19:16:05.769716 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.782881 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.810653 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e 
Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.823801 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.836261 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.848412 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.848446 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.848458 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.848475 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.848487 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.853221 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.870447 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.887151 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.901810 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.920614 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.936764 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.947969 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.950218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.950249 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.950260 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.950276 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.950285 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:05Z","lastTransitionTime":"2025-12-07T19:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.960466 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6
eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.973021 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc 
kubenswrapper[4815]: I1207 19:16:05.983861 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:05 crc kubenswrapper[4815]: I1207 19:16:05.995641 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:05Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.007841 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72
f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:
15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:06Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.019288 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-07T19:16:06Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.052292 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.052329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.052341 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.052356 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.052367 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.154482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.154532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.154541 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.154558 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.154568 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.256867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.256898 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.256934 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.256952 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.256963 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.359342 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.359403 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.359421 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.359448 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.359464 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.462548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.462582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.462594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.462611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.462621 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.565535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.565571 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.565582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.565596 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.565607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.668517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.668577 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.668595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.668620 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.668637 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.769118 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.769130 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:06 crc kubenswrapper[4815]: E1207 19:16:06.769298 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:06 crc kubenswrapper[4815]: E1207 19:16:06.769516 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.770978 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.771052 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.771066 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.771104 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.771119 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.873315 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.873380 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.873390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.873404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.873414 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.975459 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.975494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.975503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.975518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:06 crc kubenswrapper[4815]: I1207 19:16:06.975529 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:06Z","lastTransitionTime":"2025-12-07T19:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.077612 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.077905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.078028 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.078104 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.078180 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.180412 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.180577 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.180586 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.180599 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.180607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.283560 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.283799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.283889 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.284000 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.284118 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.389988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.390039 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.390052 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.390072 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.390086 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.492109 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.492147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.492156 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.492169 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.492178 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.594090 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.594134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.594149 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.594168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.594180 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.696595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.696635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.696646 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.696665 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.696677 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.769759 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.769834 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:07 crc kubenswrapper[4815]: E1207 19:16:07.770267 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:07 crc kubenswrapper[4815]: E1207 19:16:07.770145 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.798957 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.799199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.799319 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.799409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.799500 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.901942 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.901976 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.901986 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.902001 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:07 crc kubenswrapper[4815]: I1207 19:16:07.902011 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:07Z","lastTransitionTime":"2025-12-07T19:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.003729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.004338 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.004427 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.004513 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.004599 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.106939 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.106969 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.106980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.106993 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.107002 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.209789 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.209821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.209833 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.209849 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.209859 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.242120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.242324 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.242381 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.242478 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.242538 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.254107 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:08Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.257590 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.257618 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.257627 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.257640 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.257648 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.268361 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:08Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.272097 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.272123 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.272131 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.272142 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.272151 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.288641 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:08Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.292604 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.292633 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.292642 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.292655 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.292663 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.311839 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:08Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.315983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.316029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.316047 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.316071 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.316089 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.330701 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:08Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.330807 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.332007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.332027 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.332037 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.332052 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.332062 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.434639 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.434673 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.434681 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.434694 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.434703 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.536788 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.536869 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.536892 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.537400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.537767 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.639974 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.640031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.640052 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.640079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.640099 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.742645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.742701 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.742722 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.742756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.742778 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.769028 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.769132 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.769028 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:08 crc kubenswrapper[4815]: E1207 19:16:08.769226 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.844434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.844490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.844508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.844532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.844548 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.946542 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.946582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.946594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.946610 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:08 crc kubenswrapper[4815]: I1207 19:16:08.946620 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:08Z","lastTransitionTime":"2025-12-07T19:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.049856 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.049885 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.049894 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.049907 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.049980 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.153890 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.153984 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.154014 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.154039 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.154054 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.257763 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.257795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.257806 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.257823 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.257835 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.361348 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.361391 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.361428 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.361450 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.361462 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.463538 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.463585 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.463596 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.463612 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.463625 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.565782 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.565822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.565833 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.565850 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.565861 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.668470 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.668506 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.668517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.668531 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.668541 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.769571 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:09 crc kubenswrapper[4815]: E1207 19:16:09.769773 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.769630 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:09 crc kubenswrapper[4815]: E1207 19:16:09.770031 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.771652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.771666 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.771674 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.771684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.771693 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.873514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.873565 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.873577 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.873595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.873607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.975656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.975702 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.975714 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.975732 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:09 crc kubenswrapper[4815]: I1207 19:16:09.975744 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:09Z","lastTransitionTime":"2025-12-07T19:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.078112 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.078739 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.078861 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.078986 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.079069 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.182127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.182195 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.182219 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.182249 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.182271 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.285618 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.285699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.285718 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.285744 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.285763 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.388865 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.388977 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.388999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.389025 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.389042 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.493195 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.493262 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.493280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.493312 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.493336 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.597128 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.597189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.597208 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.597240 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.597261 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.701549 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.701624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.701642 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.701681 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.701698 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.769787 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.769861 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:10 crc kubenswrapper[4815]: E1207 19:16:10.770004 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:10 crc kubenswrapper[4815]: E1207 19:16:10.770138 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.805281 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.805337 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.805437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.805473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.805495 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.908786 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.908845 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.908872 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.908901 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:10 crc kubenswrapper[4815]: I1207 19:16:10.908954 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:10Z","lastTransitionTime":"2025-12-07T19:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.011553 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.011608 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.011629 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.011656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.011682 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.115086 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.115160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.115178 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.115203 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.115220 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.218233 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.218278 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.218291 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.218308 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.218319 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.321241 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.321318 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.321337 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.321364 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.321385 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.425101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.425168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.425185 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.425210 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.425228 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.528312 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.528380 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.528400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.528434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.528453 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.631710 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.631777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.631797 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.631824 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.631843 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.734981 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.735202 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.735224 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.735266 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.736155 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.769013 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.769017 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:11 crc kubenswrapper[4815]: E1207 19:16:11.769162 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:11 crc kubenswrapper[4815]: E1207 19:16:11.769231 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.839101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.839175 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.839194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.839219 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.839236 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.942600 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.942673 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.942708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.942737 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:11 crc kubenswrapper[4815]: I1207 19:16:11.942758 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:11Z","lastTransitionTime":"2025-12-07T19:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.045863 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.045948 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.045968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.045993 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.046012 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.148162 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.148225 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.148244 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.148271 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.148291 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.250817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.250882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.250900 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.250953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.250971 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.354398 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.354479 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.354497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.354522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.354543 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.458159 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.458206 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.458218 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.458236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.458248 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.560801 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.560850 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.560862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.560884 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.560895 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.643597 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:12 crc kubenswrapper[4815]: E1207 19:16:12.643809 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:16:12 crc kubenswrapper[4815]: E1207 19:16:12.643907 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:16:44.643883703 +0000 UTC m=+109.222873758 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.663708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.663777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.663799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.663826 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.663846 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.766428 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.766478 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.766497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.766522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.766539 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.769459 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.769490 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:12 crc kubenswrapper[4815]: E1207 19:16:12.769569 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:12 crc kubenswrapper[4815]: E1207 19:16:12.769689 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.869904 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.870001 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.870020 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.870045 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.870066 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.974004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.974066 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.974085 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.974112 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:12 crc kubenswrapper[4815]: I1207 19:16:12.974136 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:12Z","lastTransitionTime":"2025-12-07T19:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.077323 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.077370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.077387 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.077409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.077428 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.180898 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.180991 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.181009 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.181035 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.181053 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.276391 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/0.log" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.276466 4815 generic.go:334] "Generic (PLEG): container finished" podID="0b739f36-d9c4-4fb6-9ead-9df05e283dea" containerID="73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a" exitCode=1 Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.276509 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerDied","Data":"73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.277101 4815 scope.go:117] "RemoveContainer" containerID="73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.283992 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.284101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.284173 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.284207 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.284274 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.305262 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.327974 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.360449 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.378840 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.389062 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.389159 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.389179 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.389250 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.389271 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.395970 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.412627 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.432739 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.456207 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.474737 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.489521 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.493465 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.493693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.493811 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.493956 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.494085 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.507278 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.524591 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.540689 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.554402 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc 
kubenswrapper[4815]: I1207 19:16:13.572111 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.588794 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.597429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.597685 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.597840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.597998 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.598113 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.607063 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:13Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.700783 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.701236 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.701456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.701645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.701790 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.769322 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.770020 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:13 crc kubenswrapper[4815]: E1207 19:16:13.770237 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:13 crc kubenswrapper[4815]: E1207 19:16:13.770493 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.770510 4815 scope.go:117] "RemoveContainer" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8" Dec 07 19:16:13 crc kubenswrapper[4815]: E1207 19:16:13.771168 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.805104 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.805195 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.805245 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.805272 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.805319 4815 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.908187 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.908234 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.908252 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.908275 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:13 crc kubenswrapper[4815]: I1207 19:16:13.908293 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:13Z","lastTransitionTime":"2025-12-07T19:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.011528 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.011659 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.011686 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.011717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.011741 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.115051 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.115136 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.115153 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.115177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.115193 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.217395 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.217459 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.217479 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.217504 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.217524 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.282966 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/0.log" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.283021 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerStarted","Data":"7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.321088 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.321147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.321164 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.321189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.321210 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.423707 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.423784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.423807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.423840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.423864 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.527812 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.527858 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.527868 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.527885 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.527900 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.631280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.631344 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.631366 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.631391 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.631409 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.734550 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.734585 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.734593 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.734609 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.734618 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.768746 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.768798 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:14 crc kubenswrapper[4815]: E1207 19:16:14.768871 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:14 crc kubenswrapper[4815]: E1207 19:16:14.768985 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.837469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.837518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.837532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.837554 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.837569 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.939862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.940160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.940173 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.940187 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:14 crc kubenswrapper[4815]: I1207 19:16:14.940197 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:14Z","lastTransitionTime":"2025-12-07T19:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.041855 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.041895 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.041905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.041934 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.041947 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.145167 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.145203 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.145214 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.145230 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.145241 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.248621 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.248664 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.248677 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.248693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.248706 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.301018 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.320088 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.331084 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.348745 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.352412 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.352621 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.352788 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.352987 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.353170 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.363097 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.381666 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.395743 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.429660 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.449648 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.455156 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.455221 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.455235 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.455254 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.455265 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.472937 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.488167 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.501382 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.515191 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.527960 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.537950 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.548278 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.557379 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.557417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.557428 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc 
kubenswrapper[4815]: I1207 19:16:15.557445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.557457 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.559614 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc 
kubenswrapper[4815]: I1207 19:16:15.660026 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.660123 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.660150 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.660177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.660194 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.764033 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.764146 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.764165 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.764188 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.764206 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.769075 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:15 crc kubenswrapper[4815]: E1207 19:16:15.769277 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.769557 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:15 crc kubenswrapper[4815]: E1207 19:16:15.769726 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.787438 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.809446 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.828439 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.842791 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.861993 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.867352 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.867417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.867440 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.867469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.867491 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.883521 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.907610 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.927420 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.944494 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.961110 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:15 crc 
kubenswrapper[4815]: I1207 19:16:15.975368 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.975437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.975460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.975490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.975512 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:15Z","lastTransitionTime":"2025-12-07T19:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:15 crc kubenswrapper[4815]: I1207 19:16:15.989909 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:15Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.008319 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] 
Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.033534 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600
d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.050041 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.066276 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.078819 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.078879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.078900 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.078948 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.078964 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.095092 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.108024 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:16Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.181854 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.181890 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.181899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.181932 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.181941 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.284168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.284229 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.284246 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.284976 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.284996 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.387158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.387189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.387198 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.387212 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.387221 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.489514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.489830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.489936 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.490118 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.490205 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.593370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.593443 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.593464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.593489 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.593505 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.697155 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.697216 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.697237 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.697261 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.697280 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.769325 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.769194 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:16 crc kubenswrapper[4815]: E1207 19:16:16.769967 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:16 crc kubenswrapper[4815]: E1207 19:16:16.770396 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.793731 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.800799 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.800862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.800883 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.800942 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.800967 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.904263 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.904344 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.904368 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.904392 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:16 crc kubenswrapper[4815]: I1207 19:16:16.904412 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:16Z","lastTransitionTime":"2025-12-07T19:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.007299 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.007369 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.007398 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.007427 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.007449 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.111304 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.111360 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.111378 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.111404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.111421 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.214729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.215311 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.215339 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.215374 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.215398 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.318321 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.318375 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.318394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.318419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.318436 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.421656 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.421719 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.421737 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.421761 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.421777 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.525229 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.525268 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.525278 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.525293 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.525302 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.627836 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.627879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.627896 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.627942 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.627959 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.731012 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.731333 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.731416 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.731452 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.731639 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.769201 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.769202 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:17 crc kubenswrapper[4815]: E1207 19:16:17.769432 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:17 crc kubenswrapper[4815]: E1207 19:16:17.769585 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.834725 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.834778 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.834804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.834849 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.834872 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.939299 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.939450 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.939482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.939508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:17 crc kubenswrapper[4815]: I1207 19:16:17.939525 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:17Z","lastTransitionTime":"2025-12-07T19:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.042557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.042634 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.042658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.042699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.042728 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.145402 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.145460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.145476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.145501 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.145518 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.249284 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.249437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.249464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.249495 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.249513 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.352227 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.352289 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.352307 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.352332 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.352354 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.455009 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.455072 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.455091 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.455118 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.455136 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.551468 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.551541 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.551564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.551590 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.551607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.572451 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:18Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.578892 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.578961 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.578973 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.578991 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.579003 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.598511 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:18Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.603571 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.603603 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.603614 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.603629 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.603666 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.625640 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:18Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.630399 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.630473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.630490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.630509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.630523 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.644805 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:18Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.648910 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.649042 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.649064 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.649095 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.649116 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.669683 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:18Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.669870 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.672184 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.672219 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.672233 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.672276 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.672288 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.769700 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.769724 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.770183 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:18 crc kubenswrapper[4815]: E1207 19:16:18.770272 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.780901 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.781003 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.781021 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.781046 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.781063 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.884737 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.884815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.884837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.884866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.884887 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.988267 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.988392 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.988411 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.988437 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:18 crc kubenswrapper[4815]: I1207 19:16:18.988453 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:18Z","lastTransitionTime":"2025-12-07T19:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.091277 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.091357 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.091381 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.091496 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.091531 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.194802 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.194870 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.194887 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.194911 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.194973 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.298400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.298457 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.298473 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.298497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.298513 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.402524 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.402586 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.402626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.402658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.402679 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.505861 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.505987 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.506011 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.506044 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.506067 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.609043 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.609088 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.609100 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.609117 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.609128 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.712744 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.712817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.712838 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.712867 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.712888 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.770298 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.770447 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:19 crc kubenswrapper[4815]: E1207 19:16:19.770601 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:19 crc kubenswrapper[4815]: E1207 19:16:19.770609 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.815692 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.815740 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.815756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.815777 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.815794 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.919872 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.920035 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.920084 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.920109 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:19 crc kubenswrapper[4815]: I1207 19:16:19.920125 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:19Z","lastTransitionTime":"2025-12-07T19:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.022956 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.023012 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.023029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.023054 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.023071 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.126198 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.126264 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.126282 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.126308 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.126327 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.229515 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.229578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.229595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.229618 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.229639 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.333211 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.333302 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.333361 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.333390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.333448 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.436544 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.436591 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.436606 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.436624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.436638 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.539049 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.539116 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.539132 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.539156 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.539172 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.641687 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.641733 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.641745 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.641758 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.641766 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.744634 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.744695 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.744720 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.744749 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.744769 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.769132 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.769210 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:20 crc kubenswrapper[4815]: E1207 19:16:20.769299 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:20 crc kubenswrapper[4815]: E1207 19:16:20.769404 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.848424 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.848477 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.848495 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.848517 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.848535 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.951352 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.951415 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.951433 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.951454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:20 crc kubenswrapper[4815]: I1207 19:16:20.951474 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:20Z","lastTransitionTime":"2025-12-07T19:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.053903 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.053966 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.053981 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.054000 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.054014 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.157417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.157486 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.157505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.157529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.157546 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.261756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.261830 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.261847 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.261871 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.261896 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.365969 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.366355 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.366384 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.366407 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.366424 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.469797 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.469860 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.469884 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.469952 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.469980 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.574098 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.574177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.574200 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.574232 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.574259 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.677078 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.677136 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.677148 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.677162 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.677170 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.769675 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:21 crc kubenswrapper[4815]: E1207 19:16:21.769887 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.770229 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:21 crc kubenswrapper[4815]: E1207 19:16:21.770360 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.779455 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.779484 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.779494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.779509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.779519 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.882584 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.882624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.882636 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.882654 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.882667 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.985024 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.985101 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.985120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.985145 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:21 crc kubenswrapper[4815]: I1207 19:16:21.985164 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:21Z","lastTransitionTime":"2025-12-07T19:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.088177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.088299 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.088319 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.088343 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.088362 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.191341 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.191389 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.191406 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.191426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.191439 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.293983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.294027 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.294038 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.294054 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.294067 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.397820 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.397863 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.397878 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.397899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.397937 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.500666 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.500744 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.500769 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.500801 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.500823 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.603854 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.603953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.603972 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.603998 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.604019 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.708220 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.708284 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.708301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.708326 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.708344 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.769878 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.769878 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:22 crc kubenswrapper[4815]: E1207 19:16:22.770154 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:22 crc kubenswrapper[4815]: E1207 19:16:22.770227 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.811181 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.811228 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.811247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.811269 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.811286 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.914444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.914518 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.914543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.914578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:22 crc kubenswrapper[4815]: I1207 19:16:22.914601 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:22Z","lastTransitionTime":"2025-12-07T19:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.017837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.017906 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.017963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.017993 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.018016 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.122102 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.122151 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.122169 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.122194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.122211 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.225254 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.225311 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.225326 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.225350 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.225366 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.327708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.327746 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.327755 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.327768 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.327776 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.431157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.431230 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.431248 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.431273 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.431289 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.534464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.534519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.534535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.534559 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.534577 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.636695 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.636747 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.636760 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.636778 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.636790 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.741365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.741438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.741461 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.741494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.741517 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.769378 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.769747 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:23 crc kubenswrapper[4815]: E1207 19:16:23.769980 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:23 crc kubenswrapper[4815]: E1207 19:16:23.770044 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.844508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.844604 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.844628 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.844659 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.844681 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.947708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.947817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.947843 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.947869 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:23 crc kubenswrapper[4815]: I1207 19:16:23.947890 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:23Z","lastTransitionTime":"2025-12-07T19:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.050652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.050722 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.050747 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.050775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.050796 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.154174 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.154247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.154280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.154309 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.154330 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.257965 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.258022 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.258041 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.258063 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.258077 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.360878 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.361181 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.361223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.361254 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.361275 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.464451 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.464519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.464542 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.464573 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.464599 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.567363 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.567405 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.567414 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.567429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.567440 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.669544 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.669591 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.669607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.669690 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.669717 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.768744 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.768856 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.768758 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.769246 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.773207 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.773237 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.773247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.773260 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.773272 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.785478 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.785541 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.785559 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785588 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:28.785563218 +0000 UTC m=+153.364553283 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785611 4815 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785638 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:17:28.78563002 +0000 UTC m=+153.364620055 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.785640 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.785708 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785767 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785783 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785782 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785830 4815 
secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785845 4815 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785863 4815 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785881 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-07 19:17:28.785864417 +0000 UTC m=+153.364854482 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785792 4815 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.785984 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-07 19:17:28.785959909 +0000 UTC m=+153.364949994 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:16:24 crc kubenswrapper[4815]: E1207 19:16:24.786033 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-07 19:17:28.785997 +0000 UTC m=+153.364987145 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.877005 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.877081 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.877109 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.877145 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.877177 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.980546 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.980620 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.980644 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.980674 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:24 crc kubenswrapper[4815]: I1207 19:16:24.980694 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:24Z","lastTransitionTime":"2025-12-07T19:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.084564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.084623 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.084640 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.084666 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.084682 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.188317 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.188390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.188416 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.188446 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.188468 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.291557 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.291645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.291664 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.291690 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.291710 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.394741 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.394812 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.394832 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.394857 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.394876 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.497702 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.497758 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.497776 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.497800 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.497817 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.599824 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.599853 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.599862 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.599876 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.599885 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.702514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.702600 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.702706 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.702831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.702889 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.769063 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:25 crc kubenswrapper[4815]: E1207 19:16:25.769249 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.769383 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:25 crc kubenswrapper[4815]: E1207 19:16:25.769438 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.789355 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.804446 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.806873 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.806980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.807004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.807041 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.807065 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.824822 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.844151 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.859013 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.879014 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.899202 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.910981 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 
19:16:25.911039 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.911058 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.911087 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.911111 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:25Z","lastTransitionTime":"2025-12-07T19:16:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.917196 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.935494 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc kubenswrapper[4815]: I1207 19:16:25.954441 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:25 crc 
kubenswrapper[4815]: I1207 19:16:25.990497 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0499c01-b137-4349-ad4b-5570742d072a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:25Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.004950 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b
6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.013550 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.013611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.013628 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.013653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.013670 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.020765 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.040570 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.058078 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.072016 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.094564 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.106354 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:26Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.116804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.116871 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.116889 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.116942 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.116962 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.219396 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.219433 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.219445 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.219461 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.219474 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.322165 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.322349 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.323581 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.323611 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.323631 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.426950 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.427024 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.427045 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.427074 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.427097 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.530321 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.530372 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.530400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.530444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.530468 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.633534 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.633592 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.633613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.633645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.633666 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.736374 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.736417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.736430 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.736448 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.736463 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.769432 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.769489 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:26 crc kubenswrapper[4815]: E1207 19:16:26.769577 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:26 crc kubenswrapper[4815]: E1207 19:16:26.769634 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.839394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.839447 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.839462 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.839483 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.839497 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.943020 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.946606 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.947482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.947579 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:26 crc kubenswrapper[4815]: I1207 19:16:26.947605 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:26Z","lastTransitionTime":"2025-12-07T19:16:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.051838 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.051963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.051982 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.052006 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.052023 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.155297 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.155362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.155380 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.155405 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.155424 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.258962 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.259021 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.259040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.259066 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.259084 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.362394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.362454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.362472 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.362496 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.362514 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.465411 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.465466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.465490 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.465519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.465538 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.569271 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.569346 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.569371 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.569400 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.569423 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.671866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.671949 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.671968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.671995 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.672012 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.770520 4815 scope.go:117] "RemoveContainer" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.771027 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.771195 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 07 19:16:27 crc kubenswrapper[4815]: E1207 19:16:27.771364 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 07 19:16:27 crc kubenswrapper[4815]: E1207 19:16:27.771496 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.776707 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.776750 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.776767 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.776791 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.776808 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.879650 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.879726 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.879745 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.879771 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.879797 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.983103 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.983206 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.983225 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.983252 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:27 crc kubenswrapper[4815]: I1207 19:16:27.983303 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:27Z","lastTransitionTime":"2025-12-07T19:16:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.087261 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.087316 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.087333 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.087358 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.087375 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.191313 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.191370 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.191388 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.191415 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.191432 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.295215 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.295300 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.295336 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.295366 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.295386 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.399074 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.399493 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.399659 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.399825 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.399850 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.502888 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.502975 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.502998 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.503022 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.503040 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.606316 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.606366 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.606389 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.606414 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.606430 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.709695 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.710670 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.710882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.711138 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.711312 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.769150 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.769187 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22"
Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.769293 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.769513 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.805525 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.805593 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.805615 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.805654 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.805677 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.829109 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:28Z is after 2025-08-24T17:21:41Z"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.833616 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.833651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.833679 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.833698 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.833714 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.847186 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.851173 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.851222 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.851240 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.851266 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.851284 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.871014 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.875294 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.875378 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.875395 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.875415 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.875429 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.893262 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.899505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.899554 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.899569 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.899589 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.899602 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.914156 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:28Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:28 crc kubenswrapper[4815]: E1207 19:16:28.914326 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.916077 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.916117 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.916133 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.916156 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:28 crc kubenswrapper[4815]: I1207 19:16:28.916170 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:28Z","lastTransitionTime":"2025-12-07T19:16:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.018831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.018865 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.018876 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.018893 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.018903 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.120941 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.120967 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.120976 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.120989 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.120998 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.223819 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.223852 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.223864 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.223881 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.223893 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.326860 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.326908 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.326957 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.326980 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.326998 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.372469 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/2.log" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.375315 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.375775 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.405478 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new 
object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.415425 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.429817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.429854 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.429865 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.429887 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.429900 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.432869 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af
34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.452643 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.469584 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.486987 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.501583 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.517204 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.528284 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.531612 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 
19:16:29.531674 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.531692 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.531717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.531737 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.541839 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.560161 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.578486 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.591334 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.601591 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc 
kubenswrapper[4815]: I1207 19:16:29.614726 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.633521 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.634003 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.634053 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.634071 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.634097 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.634117 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.647184 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.677605 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0499c01-b137-4349-ad4b-5570742d072a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:29Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.736428 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.736519 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.736538 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.736563 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.736581 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.769520 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:29 crc kubenswrapper[4815]: E1207 19:16:29.769875 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.769973 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:29 crc kubenswrapper[4815]: E1207 19:16:29.771173 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.787182 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.839741 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.839798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.839815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.839838 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.839855 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.943561 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.943624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.943645 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.943671 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:29 crc kubenswrapper[4815]: I1207 19:16:29.943688 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:29Z","lastTransitionTime":"2025-12-07T19:16:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.047068 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.047387 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.047407 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.047434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.047452 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.150431 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.150508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.150527 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.150555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.150592 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.254078 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.254137 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.254155 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.254180 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.254202 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.357524 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.357577 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.357594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.357618 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.357635 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.381321 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/3.log" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.382556 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/2.log" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.387055 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" exitCode=1 Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.387132 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.387229 4815 scope.go:117] "RemoveContainer" containerID="9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.388513 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:16:30 crc kubenswrapper[4815]: E1207 19:16:30.388808 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.411296 4815 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.437766 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.451025 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.460141 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.460189 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.460199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.460214 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.460227 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.469897 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.484099 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.495412 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.513681 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.528422 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.540981 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.557585 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc 
kubenswrapper[4815]: I1207 19:16:30.563039 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.563120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.563147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.563179 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.563202 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.579711 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.600834 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9ef
cd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.615579 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.627810 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbbc6dc9-0ce5-4bbb-a7f4-0bd7c158f2ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc18ff4f1a0a1e7aae38ce7dad5fbea485553d57aac5dbd709fe94d69b4ac6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.650793 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0499c01-b137-4349-ad4b-5570742d072a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.665264 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.665297 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.665309 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.665328 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.665338 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.672369 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d8bbcb8ed47df52ce8120a29864bc098ad0cce571dad6460aea888eda70ccf8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:15:59Z\\\",\\\"message\\\":\\\"map[10.217.4.213:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {2ead45b3-c313-4fbc-a7bc-2b3c4ffd610c}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1207 19:15:58.661470 6472 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661480 6472 ovn.go:134] Ensuring zone local for Pod openshift-kube-controller-manager/kube-controller-manager-crc in node crc\\\\nI1207 19:15:58.661485 6472 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1207 19:15:58.661491 6472 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1207 19:15:58.661489 6472 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1207 19:15:58.658410 6472 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:57Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:29Z\\\",\\\"message\\\":\\\"etwork-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1207 19:16:29.461728 6776 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1207 19:16:29.461761 6776 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1207 19:16:29.461789 6776 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1207 19:16:29.461806 6776 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 867.943µs\\\\nI1207 19:16:29.461815 6776 factory.go:1336] Added *v1.Node event handler 7\\\\nI1207 19:16:29.461853 6776 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1207 19:16:29.462259 6776 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1207 19:16:29.462362 6776 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1207 19:16:29.462404 6776 ovnkube.go:599] Stopped ovnkube\\\\nI1207 19:16:29.462446 6776 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1207 19:16:29.462529 6776 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:16:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c9
0996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.685941 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.701484 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.711276 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:30Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.768168 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.768239 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.768266 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.768295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.768316 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.769306 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.769411 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:30 crc kubenswrapper[4815]: E1207 19:16:30.769508 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:30 crc kubenswrapper[4815]: E1207 19:16:30.769648 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.871315 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.871373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.871392 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.871416 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.871431 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.974859 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.974973 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.975000 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.975030 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:30 crc kubenswrapper[4815]: I1207 19:16:30.975052 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:30Z","lastTransitionTime":"2025-12-07T19:16:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.078502 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.078555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.078572 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.078597 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.078615 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.182828 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.182982 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.183008 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.183037 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.183061 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.286226 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.286275 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.286291 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.286317 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.286337 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.388364 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.388426 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.388444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.388470 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.388487 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.395315 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/3.log" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.398869 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:16:31 crc kubenswrapper[4815]: E1207 19:16:31.399022 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.412400 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.423582 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc 
kubenswrapper[4815]: I1207 19:16:31.435315 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbbc6dc9-0ce5-4bbb-a7f4-0bd7c158f2ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc18ff4f1a0a1e7aae38ce7dad5fbea485553d57aac5dbd709fe94d69b4ac6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.465084 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0499c01-b137-4349-ad4b-5570742d072a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.483888 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.490899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.490999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.491019 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.491044 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.491061 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.500880 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.514098 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.529589 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.545652 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.578958 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:29Z\\\",\\\"message\\\":\\\"etwork-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1207 19:16:29.461728 6776 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1207 19:16:29.461761 6776 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1207 19:16:29.461789 6776 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1207 19:16:29.461806 6776 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 867.943µs\\\\nI1207 19:16:29.461815 6776 factory.go:1336] Added *v1.Node event handler 7\\\\nI1207 19:16:29.461853 6776 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1207 19:16:29.462259 6776 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1207 19:16:29.462362 6776 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1207 19:16:29.462404 6776 ovnkube.go:599] Stopped ovnkube\\\\nI1207 19:16:29.462446 6776 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1207 19:16:29.462529 6776 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:16:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.593589 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.593626 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.593637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.593653 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.593665 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.595645 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.611467 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.632444 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.650182 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.667424 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.688523 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.696322 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.696509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.696644 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.696785 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.696905 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.701652 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.716946 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.736132 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:31Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.769804 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.770022 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:31 crc kubenswrapper[4815]: E1207 19:16:31.770114 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:31 crc kubenswrapper[4815]: E1207 19:16:31.770291 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.799442 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.799589 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.799709 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.799803 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.799907 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.902526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.902598 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.902620 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.902649 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:31 crc kubenswrapper[4815]: I1207 19:16:31.902667 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:31Z","lastTransitionTime":"2025-12-07T19:16:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.006041 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.006099 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.006120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.006146 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.006165 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.108698 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.109066 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.109286 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.109463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.109633 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.212408 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.212832 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.213081 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.213304 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.213517 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.316497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.316558 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.316614 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.316642 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.316659 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.419310 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.419365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.419385 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.419409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.419426 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.522369 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.522440 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.522464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.522489 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.522507 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.625750 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.625820 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.625844 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.625873 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.625895 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.729109 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.729174 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.729196 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.729219 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.729235 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.769606 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.769690 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:32 crc kubenswrapper[4815]: E1207 19:16:32.769784 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:32 crc kubenswrapper[4815]: E1207 19:16:32.769970 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.831429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.831485 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.831502 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.831527 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.831543 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.935373 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.935438 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.935456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.935505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:32 crc kubenswrapper[4815]: I1207 19:16:32.935523 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:32Z","lastTransitionTime":"2025-12-07T19:16:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.039079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.039136 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.039147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.039171 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.039185 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.142595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.142673 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.142697 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.142726 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.142750 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.246478 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.246555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.246578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.246608 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.246635 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.350029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.350106 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.350130 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.350158 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.350180 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.452578 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.452613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.452621 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.452635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.452664 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.555839 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.555880 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.555891 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.555905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.555940 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.663587 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.663623 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.663635 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.663651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.663662 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.767064 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.767116 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.767133 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.767154 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.767168 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.769382 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.769382 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:33 crc kubenswrapper[4815]: E1207 19:16:33.769540 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:33 crc kubenswrapper[4815]: E1207 19:16:33.769591 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.869804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.869839 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.869848 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.869860 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.869868 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.972757 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.972804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.972821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.972841 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:33 crc kubenswrapper[4815]: I1207 19:16:33.972854 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:33Z","lastTransitionTime":"2025-12-07T19:16:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.076526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.076566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.076575 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.076589 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.076597 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.179429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.179504 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.179526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.179554 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.179577 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.282543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.282621 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.282644 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.282673 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.282697 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.385815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.385873 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.385893 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.385955 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.385976 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.488628 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.488681 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.488693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.488745 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.488756 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.592285 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.592344 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.592362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.592386 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.592409 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.695717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.695772 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.695790 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.695817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.695839 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.768909 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:34 crc kubenswrapper[4815]: E1207 19:16:34.769037 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.769164 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:34 crc kubenswrapper[4815]: E1207 19:16:34.769303 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.798495 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.798547 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.798566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.798589 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.798607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.901439 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.901494 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.901514 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.901538 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:34 crc kubenswrapper[4815]: I1207 19:16:34.901555 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:34Z","lastTransitionTime":"2025-12-07T19:16:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.003650 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.003708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.003719 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.003735 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.003744 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.107127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.107183 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.107200 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.107223 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.107296 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.210663 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.210719 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.210742 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.210766 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.210784 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.313796 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.313856 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.313874 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.313897 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.313949 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.416404 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.416460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.416500 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.416526 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.416542 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.520036 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.520096 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.520114 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.520140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.520156 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.622822 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.622881 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.622898 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.622958 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.622984 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.725828 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.725882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.725900 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.725957 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.725976 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.769643 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:35 crc kubenswrapper[4815]: E1207 19:16:35.769809 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.769838 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:35 crc kubenswrapper[4815]: E1207 19:16:35.769972 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.785959 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d662ba2-aa03-4eea-bd30-8ad40638f6c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4939db657209a8744656f63541b50598888981216b000dd4316a9327fdfbcf34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55eba62082762b6527923446c2d419d2eb009193
31cf8e59bfa9cb27ca65a82e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2v8vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-gkn4h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.800714 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-xbq22" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"201e9ba8-3e19-4555-90f0-587497a2a328\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z24bd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-xbq22\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc 
kubenswrapper[4815]: I1207 19:16:35.816311 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bbbc6dc9-0ce5-4bbb-a7f4-0bd7c158f2ee\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc18ff4f1a0a1e7aae38ce7dad5fbea485553d57aac5dbd709fe94d69b4ac6d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6721c873a3c399141d67e58a9eb8d54614c64a48d6a3e373b6b74b884de6450\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.838827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.838897 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.838935 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 
19:16:35.838961 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.838978 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.855273 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0499c01-b137-4349-ad4b-5570742d072a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c5e629885bc5c80c370216c2ad383a9c258c28ef825a043812dbb045e61aa1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db98a67270148a5b0749995280d4e5591b1682e847d56b06654319b528928abf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7f2e12c3c62b663f9ccb2d51e3e9d80e25024d21bd1c539f999401238ee2003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af37bcac7eb4c2aae020c71eb18933c2d3a283f59b3dbb02dba4dfdee1c3928\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7bfebca60e63011d26025b24bad72ec02c223b2d889f6985504a371645bcfc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0faf51cb907b5a90d7d1856e10ed14b8550be7fea5ff9b912ad8fbb96d1a67ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b55f8d454d8472869d96503b5e9e62e8f8f3a41fa94d4d4e6f38e3c8446b025c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://df109e5aa0deba7a690ea1f5f160ab06d8a3a3e0aa3536b620a7ff738e61c701\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.872734 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-s95hp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0b739f36-d9c4-4fb6-9ead-9df05e283dea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:12Z\\\",\\\"message\\\":\\\"2025-12-07T19:15:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e\\\\n2025-12-07T19:15:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5074391b-4630-4c1f-b2fa-f9e22786274e to /host/opt/cni/bin/\\\\n2025-12-07T19:15:27Z [verbose] multus-daemon started\\\\n2025-12-07T19:15:27Z [verbose] 
Readiness Indicator file check\\\\n2025-12-07T19:16:12Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:16:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-prbsc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-s95hp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.891583 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fefaece-ab52-48e2-9ee9-fb07be1922f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://600
d32fcae56dd0fa63421283e9b0fdfc23c4589a1dc4a66e5ff1f4c6a415e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e14454c2c5dbbdcbfd3ff7ec5d618c2dfe48f41af9846b72f61941b878e53d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99f33f1decd49213b8e034c2d1f2a3a51ab99aa18679bd3378becaa27f1ef4b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c519c7c8ea1b7405466970c64269b72cc6e8129838764dfc4a662cb4df419c6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8c9efcd688f4886df56fb4d0e2ff5f5dd0c9e877186e1b53d398b7bed0547609\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a622d5231f5bacbb9dba0290a87f10310a0053be2dd692718fcf1002deb44\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://825649f5768765281bad22f9b86ed162b80cc258ce11a746ec06c9f415a21529\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-25wn4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gmf4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.908607 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5dfd265a-0c7f-40bd-9226-82c2b1abbeda\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3523e317081c56f05c00d0288c6ad6f1ff04f1346772bc0c86aed488570786a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a277a724d026909546ae84aee6d8d29fc87f0
277f9d6aa45ca99e5f116ac79d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rzfhb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-tqxds\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.929261 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"538dccc2-452b-4d13-8b0d-df993c2b69a2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae001d2e4c277c57382ca0c4a1286c3c135f2fc06ca5a78df27cbdab588ce576\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89a76372fcc5ff7131084098e41183d148096169b9ccaf69d59dbef45dcfd755\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29e69a7aa018d68c2e39f3e0f7a619fb25262acaebf83b81fabb6f9b87a9892\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.943020 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.943066 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.943079 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.943097 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.943113 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:35Z","lastTransitionTime":"2025-12-07T19:16:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.947429 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cb6cb3bfed28d77ff74b7a2273ddb477e21f930f79b169bfbd98fa10a5ead5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.982499 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-07T19:16:29Z\\\",\\\"message\\\":\\\"etwork-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1207 19:16:29.461728 6776 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1207 19:16:29.461761 6776 address_set.go:302] 
New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1207 19:16:29.461789 6776 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/machine-api-operator\\\\\\\"}\\\\nI1207 19:16:29.461806 6776 services_controller.go:360] Finished syncing service machine-api-operator on namespace openshift-machine-api for network=default : 867.943µs\\\\nI1207 19:16:29.461815 6776 factory.go:1336] Added *v1.Node event handler 7\\\\nI1207 19:16:29.461853 6776 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1207 19:16:29.462259 6776 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1207 19:16:29.462362 6776 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1207 19:16:29.462404 6776 ovnkube.go:599] Stopped ovnkube\\\\nI1207 19:16:29.462446 6776 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1207 19:16:29.462529 6776 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:16:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a4a4b2d4e80c078ae
857e91491c082c6b1c57a5297c485c31fac0c90996b19f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:15:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:15:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tddph\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zzw6c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:35 crc kubenswrapper[4815]: I1207 19:16:35.999363 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kh7gd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29b15186-a725-4067-ba76-336f580327fa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fa8bda26912926bc52757424a9930e1eca6ee953f28ffdb205c1b8b8f29d13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7rzjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:28Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kh7gd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:35Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.018113 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.032906 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-nz6q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92d414c2-ecc3-4598-ac9d-b982bfd89c7e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://003f0d6dca39f167925bc3a05634d9dd20f23e54135631e0099a56801f9daf3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cp5k8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:15:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-nz6q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.048667 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.048756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.048838 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.048986 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.049098 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.051729 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c752886-7f9b-4605-8358-0fde597c93da\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://441c88363ea1cc8305fa21c51c9237798c47d82a207c9e80da98df93027fe4a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb1ec115ead917caf40ce3586653d6
eb78e9a290d845f766f8818c2b57ece6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda919a651aa83d2f9a82f711d0b242e14140e3d13ec5a80f7ec675a4aedb21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33a4c84a8cd512a90c51f4721d9ddfde2a452e7f03d9809d492d5bccd7cf4aa4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.074601 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-07T19:14:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-07T19:15:20Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1207 19:15:09.905103 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1207 19:15:09.906859 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4228347019/tls.crt::/tmp/serving-cert-4228347019/tls.key\\\\\\\"\\\\nI1207 19:15:20.294696 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1207 19:15:20.298748 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1207 19:15:20.298781 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1207 19:15:20.298843 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1207 19:15:20.298860 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1207 19:15:20.305864 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1207 19:15:20.305885 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1207 19:15:20.305891 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305973 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1207 19:15:20.305983 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1207 19:15:20.305988 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1207 19:15:20.305992 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1207 19:15:20.305997 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1207 19:15:20.308727 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:14:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4460a4577c38b0cefbd1b83c55ab11d
12d5f32ce78cda8ec02629e73bd4966b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-07T19:14:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-07T19:14:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-07T19:14:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.097246 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.119605 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b57a8653da7e9136a6c54872ff67a01c2d6ad5077842b783ffe74e0069bc8693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.141505 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:20Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.151784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.151840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.151855 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.151877 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.151889 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.161310 4815 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-07T19:15:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ec34a6371c74ee0778e27d73f5e2948d9ee1bb518f658675df123df60879c61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bc20ae853b989c9e9cfeb47a40bd1a02397e3ce0b0b4feec3fcec56ea7be16a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-07T19:15:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:36Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.254807 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.254887 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.254905 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 
19:16:36.254951 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.254968 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.358122 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.358185 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.358202 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.358226 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.358246 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.460851 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.460978 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.461007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.461036 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.461058 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.564240 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.564306 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.564325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.564349 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.564367 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.667431 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.667497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.667516 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.667541 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.667558 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.768953 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.769006 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:36 crc kubenswrapper[4815]: E1207 19:16:36.769117 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:36 crc kubenswrapper[4815]: E1207 19:16:36.769319 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.770365 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.770440 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.770466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.770492 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.770511 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.873301 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.873360 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.873384 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.873412 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.873431 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.976406 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.976464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.976481 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.976506 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:36 crc kubenswrapper[4815]: I1207 19:16:36.976524 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:36Z","lastTransitionTime":"2025-12-07T19:16:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.079094 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.079142 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.079160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.079181 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.079197 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.182194 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.182248 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.182265 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.182292 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.182309 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.289735 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.290652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.290683 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.290708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.290729 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.393896 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.393988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.394008 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.394038 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.394060 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.497085 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.497134 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.497151 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.497176 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.497192 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.600123 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.600183 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.600201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.600227 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.600247 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.703464 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.703757 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.703976 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.704156 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.704334 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.769784 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.770314 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:37 crc kubenswrapper[4815]: E1207 19:16:37.770594 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:37 crc kubenswrapper[4815]: E1207 19:16:37.771189 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.807651 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.807711 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.807731 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.807756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.807773 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.911270 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.911325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.911350 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.911380 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:37 crc kubenswrapper[4815]: I1207 19:16:37.911403 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:37Z","lastTransitionTime":"2025-12-07T19:16:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.013942 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.014005 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.014026 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.014054 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.014073 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.117395 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.117454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.117476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.117504 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.117522 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.220732 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.220787 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.220804 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.220826 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.220844 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.323762 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.323838 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.323873 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.323898 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.323959 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.426320 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.426378 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.426396 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.426419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.426436 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.529211 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.529278 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.529302 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.529329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.529348 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.631686 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.631721 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.631729 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.631746 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.631755 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.734476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.734516 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.734528 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.734545 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.734558 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.769373 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.769433 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:38 crc kubenswrapper[4815]: E1207 19:16:38.769563 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:38 crc kubenswrapper[4815]: E1207 19:16:38.769803 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.837872 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.837963 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.837988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.838017 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.838036 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.941004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.941035 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.941044 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.941057 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:38 crc kubenswrapper[4815]: I1207 19:16:38.941066 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:38Z","lastTransitionTime":"2025-12-07T19:16:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.044100 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.044143 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.044157 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.044177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.044193 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.112319 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.112598 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.112821 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.113082 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.113286 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.134563 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.140410 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.140504 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.140609 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.140652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.140689 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.161302 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.167058 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.167291 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.167476 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.167688 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.167879 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.187731 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.192475 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.192738 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.192987 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.193217 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.193398 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.210440 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.215906 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.216144 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.216281 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.216427 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.216550 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.232762 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-07T19:16:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d749f27-20c8-4d23-ad52-dc6b852bf3b7\\\",\\\"systemUUID\\\":\\\"077277cc-9bde-4aeb-947a-0cf3c49a1ac0\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-07T19:16:39Z is after 2025-08-24T17:21:41Z" Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.233215 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.235173 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.235272 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.235295 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.235322 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.235340 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.338735 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.338798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.338814 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.338837 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.338854 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.442119 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.442169 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.442186 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.442211 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.442229 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.545434 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.545477 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.545507 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.545523 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.545532 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.648442 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.648485 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.648503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.648525 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.648542 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.751664 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.751701 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.751717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.751738 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.751753 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.769530 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.769827 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.770011 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:39 crc kubenswrapper[4815]: E1207 19:16:39.770128 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.855114 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.855184 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.855209 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.855239 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.855262 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.958408 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.958495 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.958511 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.958532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:39 crc kubenswrapper[4815]: I1207 19:16:39.958548 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:39Z","lastTransitionTime":"2025-12-07T19:16:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.061305 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.061376 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.061394 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.061441 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.061459 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.164283 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.164399 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.164419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.164444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.164463 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.267956 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.268034 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.268052 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.268540 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.268607 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.372535 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.372582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.372600 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.372622 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.372640 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.475356 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.475403 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.475436 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.475454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.475467 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.578057 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.578127 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.578147 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.578173 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.578191 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.681137 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.681201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.681221 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.681247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.681264 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.769587 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.769597 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:40 crc kubenswrapper[4815]: E1207 19:16:40.769802 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:40 crc kubenswrapper[4815]: E1207 19:16:40.769998 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.783835 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.783896 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.783937 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.783962 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.783982 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.887409 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.887503 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.887520 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.887544 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.887563 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.990369 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.990463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.990483 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.990507 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:40 crc kubenswrapper[4815]: I1207 19:16:40.990523 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:40Z","lastTransitionTime":"2025-12-07T19:16:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.094092 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.094154 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.094171 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.094202 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.094224 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.197032 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.197121 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.197140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.197162 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.197179 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.300551 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.300609 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.300625 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.300650 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.300667 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.403399 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.403460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.403481 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.403505 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.403524 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.506359 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.506419 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.506441 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.506470 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.506506 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.609899 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.610004 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.610029 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.610057 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.610078 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.713477 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.713532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.713548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.713570 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.713588 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.772117 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.772203 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:41 crc kubenswrapper[4815]: E1207 19:16:41.772305 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:41 crc kubenswrapper[4815]: E1207 19:16:41.772500 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.815953 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.816013 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.816031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.816055 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.816072 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.919567 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.919652 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.919675 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.919700 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:41 crc kubenswrapper[4815]: I1207 19:16:41.919717 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:41Z","lastTransitionTime":"2025-12-07T19:16:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.022910 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.022959 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.022968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.022981 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.022991 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.125935 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.125992 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.126010 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.126031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.126047 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.228705 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.228760 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.228769 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.228783 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.228792 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.332431 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.332874 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.333160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.333407 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.333594 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.437827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.437887 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.437907 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.437972 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.437992 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.540911 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.541007 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.541031 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.541059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.541080 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.643483 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.643530 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.643543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.643565 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.643577 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.747152 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.747325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.747351 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.747381 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.747402 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.768880 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:42 crc kubenswrapper[4815]: E1207 19:16:42.769217 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.769399 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:42 crc kubenswrapper[4815]: E1207 19:16:42.769615 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.770851 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:16:42 crc kubenswrapper[4815]: E1207 19:16:42.771189 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.851245 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.851335 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.851362 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.851398 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.851426 4815 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.954348 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.954410 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.954429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.954456 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:42 crc kubenswrapper[4815]: I1207 19:16:42.954477 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:42Z","lastTransitionTime":"2025-12-07T19:16:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.057006 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.057071 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.057093 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.057125 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.057148 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.160700 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.160813 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.160833 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.160858 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.160877 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.264460 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.264540 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.264563 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.264594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.264617 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.367590 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.367661 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.367677 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.367746 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.367764 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.471509 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.471564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.471584 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.471607 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.471624 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.574527 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.574684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.574705 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.574727 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.574744 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.677643 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.677706 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.677724 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.677750 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.677767 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.769826 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.770051 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:43 crc kubenswrapper[4815]: E1207 19:16:43.770216 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:43 crc kubenswrapper[4815]: E1207 19:16:43.770564 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.781355 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.781411 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.781429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.781455 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.781473 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.884671 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.884805 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.884831 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.884859 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.884877 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.987479 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.987522 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.987534 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.987548 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:43 crc kubenswrapper[4815]: I1207 19:16:43.987559 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:43Z","lastTransitionTime":"2025-12-07T19:16:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.090619 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.090677 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.090699 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.090725 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.090751 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.193582 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.193646 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.193674 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.193702 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.193722 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.298488 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.298552 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.298590 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.298617 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.298634 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.401615 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.401726 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.401743 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.401793 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.401810 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.505973 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.506075 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.506102 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.506135 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.506226 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.611120 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.611177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.611201 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.611231 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.611251 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.712808 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:44 crc kubenswrapper[4815]: E1207 19:16:44.712993 4815 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:16:44 crc kubenswrapper[4815]: E1207 19:16:44.713055 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs podName:201e9ba8-3e19-4555-90f0-587497a2a328 nodeName:}" failed. No retries permitted until 2025-12-07 19:17:48.713035391 +0000 UTC m=+173.292025446 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs") pod "network-metrics-daemon-xbq22" (UID: "201e9ba8-3e19-4555-90f0-587497a2a328") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.714753 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.714988 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.715140 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.715367 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.715650 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.769489 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.769500 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:44 crc kubenswrapper[4815]: E1207 19:16:44.770086 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:44 crc kubenswrapper[4815]: E1207 19:16:44.770090 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.818216 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.818251 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.818263 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.818279 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.818289 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.921521 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.921989 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.922196 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.922353 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:44 crc kubenswrapper[4815]: I1207 19:16:44.922509 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:44Z","lastTransitionTime":"2025-12-07T19:16:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.025333 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.025703 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.025866 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.026040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.026176 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.129768 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.129827 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.129852 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.129882 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.129908 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.232999 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.233055 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.233073 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.233098 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.233115 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.335247 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.335280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.335291 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.335329 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.335342 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.438756 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.438803 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.438815 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.438834 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.438847 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.541708 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.541752 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.541766 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.541784 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.541798 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.644555 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.644721 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.644747 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.644770 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.644787 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.747742 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.747796 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.747817 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.747841 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.747858 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.769591 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:45 crc kubenswrapper[4815]: E1207 19:16:45.769776 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.770163 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:45 crc kubenswrapper[4815]: E1207 19:16:45.770314 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.800117 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podStartSLOduration=80.800094451 podStartE2EDuration="1m20.800094451s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.799501575 +0000 UTC m=+110.378491660" watchObservedRunningTime="2025-12-07 19:16:45.800094451 +0000 UTC m=+110.379084526" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.853206 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.853264 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.853280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.853302 4815 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.853364 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.883496 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=29.88346724 podStartE2EDuration="29.88346724s" podCreationTimestamp="2025-12-07 19:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.881893738 +0000 UTC m=+110.460883823" watchObservedRunningTime="2025-12-07 19:16:45.88346724 +0000 UTC m=+110.462457335" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.884327 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=16.884314673 podStartE2EDuration="16.884314673s" podCreationTimestamp="2025-12-07 19:16:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.836290753 +0000 UTC m=+110.415280848" watchObservedRunningTime="2025-12-07 19:16:45.884314673 +0000 UTC m=+110.463304768" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.909150 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-s95hp" podStartSLOduration=80.909117789 podStartE2EDuration="1m20.909117789s" 
podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.906995552 +0000 UTC m=+110.485985647" watchObservedRunningTime="2025-12-07 19:16:45.909117789 +0000 UTC m=+110.488107884" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.938704 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gmf4f" podStartSLOduration=80.938688303 podStartE2EDuration="1m20.938688303s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.937407028 +0000 UTC m=+110.516397083" watchObservedRunningTime="2025-12-07 19:16:45.938688303 +0000 UTC m=+110.517678358" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.958463 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.958529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.958542 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.958560 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.958573 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:45Z","lastTransitionTime":"2025-12-07T19:16:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.964199 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-tqxds" podStartSLOduration=79.964172517 podStartE2EDuration="1m19.964172517s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.962758259 +0000 UTC m=+110.541748344" watchObservedRunningTime="2025-12-07 19:16:45.964172517 +0000 UTC m=+110.543162602" Dec 07 19:16:45 crc kubenswrapper[4815]: I1207 19:16:45.998648 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=85.998631942 podStartE2EDuration="1m25.998631942s" podCreationTimestamp="2025-12-07 19:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:45.99816415 +0000 UTC m=+110.577154215" watchObservedRunningTime="2025-12-07 19:16:45.998631942 +0000 UTC m=+110.577621997" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.063547 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.063593 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.063608 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.063637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc 
kubenswrapper[4815]: I1207 19:16:46.063652 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.072345 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kh7gd" podStartSLOduration=81.072329871 podStartE2EDuration="1m21.072329871s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:46.07154991 +0000 UTC m=+110.650539965" watchObservedRunningTime="2025-12-07 19:16:46.072329871 +0000 UTC m=+110.651319926" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.100956 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-nz6q9" podStartSLOduration=81.100935419 podStartE2EDuration="1m21.100935419s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:46.083750658 +0000 UTC m=+110.662740713" watchObservedRunningTime="2025-12-07 19:16:46.100935419 +0000 UTC m=+110.679925474" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.120860 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=63.120840004 podStartE2EDuration="1m3.120840004s" podCreationTimestamp="2025-12-07 19:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:46.102120181 +0000 UTC m=+110.681110236" watchObservedRunningTime="2025-12-07 19:16:46.120840004 +0000 UTC m=+110.699830059" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.138833 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=86.138807006 podStartE2EDuration="1m26.138807006s" podCreationTimestamp="2025-12-07 19:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:46.121200004 +0000 UTC m=+110.700190069" watchObservedRunningTime="2025-12-07 19:16:46.138807006 +0000 UTC m=+110.717797061" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.166444 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.166469 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.166479 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.166492 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.166500 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.269390 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.269429 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.269440 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.269461 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.269473 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.372734 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.372780 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.372795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.372814 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.372838 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.475040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.475090 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.475106 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.475128 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.475144 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.578435 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.578510 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.578534 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.578564 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.578587 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.681840 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.681962 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.682003 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.682032 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.682048 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.768862 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.768975 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:46 crc kubenswrapper[4815]: E1207 19:16:46.769043 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:46 crc kubenswrapper[4815]: E1207 19:16:46.769195 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.785325 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.785428 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.785453 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.785482 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.785504 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.888545 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.888640 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.888663 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.888693 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.888715 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.992056 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.992126 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.992149 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.992179 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:46 crc kubenswrapper[4815]: I1207 19:16:46.992201 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:46Z","lastTransitionTime":"2025-12-07T19:16:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.095594 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.095661 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.095678 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.095751 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.095773 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.198624 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.198684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.198702 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.198726 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.198744 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.302744 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.302841 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.302858 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.302883 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.302901 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.406595 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.406658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.406718 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.406746 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.406763 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.510454 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.510520 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.510545 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.510574 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.510596 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.613685 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.613766 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.613791 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.613820 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.613836 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.717283 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.717319 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.717328 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.717344 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.717352 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.769603 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.769902 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:47 crc kubenswrapper[4815]: E1207 19:16:47.770152 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:47 crc kubenswrapper[4815]: E1207 19:16:47.770392 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.819561 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.819775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.819879 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.819979 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.820121 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.923717 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.924088 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.924374 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.924532 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:47 crc kubenswrapper[4815]: I1207 19:16:47.924734 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:47Z","lastTransitionTime":"2025-12-07T19:16:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.028795 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.028850 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.028869 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.028894 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.028962 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.132417 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.132484 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.132508 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.132536 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.132558 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.235529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.235609 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.235649 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.235689 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.235711 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.338529 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.338613 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.338637 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.338668 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.338690 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.441684 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.441748 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.441764 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.441790 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.441807 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.548436 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.548497 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.548515 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.548544 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.548562 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.651983 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.652040 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.652059 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.652083 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.652099 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.755385 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.755449 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.755471 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.755500 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.755522 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.769095 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.769101 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:48 crc kubenswrapper[4815]: E1207 19:16:48.769262 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:48 crc kubenswrapper[4815]: E1207 19:16:48.769365 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.859108 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.859160 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.859177 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.859199 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.859216 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.961466 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.961592 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.961658 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.961696 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:48 crc kubenswrapper[4815]: I1207 19:16:48.961715 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:48Z","lastTransitionTime":"2025-12-07T19:16:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.064228 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.064285 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.064303 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.064328 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.064346 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:49Z","lastTransitionTime":"2025-12-07T19:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.167707 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.167758 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.167775 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.167798 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.167818 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:49Z","lastTransitionTime":"2025-12-07T19:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.270566 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.270959 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.271130 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.271280 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.271428 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:49Z","lastTransitionTime":"2025-12-07T19:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.292543 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.292765 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.292968 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.293151 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.293279 4815 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-07T19:16:49Z","lastTransitionTime":"2025-12-07T19:16:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.356196 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb"] Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.356580 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.360066 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.360152 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.360575 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.360787 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.466020 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e588d4-9615-4a3e-8644-038e6268c30f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.466085 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e588d4-9615-4a3e-8644-038e6268c30f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.466120 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e5e588d4-9615-4a3e-8644-038e6268c30f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.466163 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.466192 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567443 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567508 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e588d4-9615-4a3e-8644-038e6268c30f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567582 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e588d4-9615-4a3e-8644-038e6268c30f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567639 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e588d4-9615-4a3e-8644-038e6268c30f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567702 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567802 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.567583 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/e5e588d4-9615-4a3e-8644-038e6268c30f-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.569361 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5e588d4-9615-4a3e-8644-038e6268c30f-service-ca\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.578071 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5e588d4-9615-4a3e-8644-038e6268c30f-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.591128 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e5e588d4-9615-4a3e-8644-038e6268c30f-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-spgpb\" (UID: \"e5e588d4-9615-4a3e-8644-038e6268c30f\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.681214 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.769586 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:49 crc kubenswrapper[4815]: E1207 19:16:49.769717 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:49 crc kubenswrapper[4815]: I1207 19:16:49.770007 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:49 crc kubenswrapper[4815]: E1207 19:16:49.770290 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:50 crc kubenswrapper[4815]: I1207 19:16:50.470213 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" event={"ID":"e5e588d4-9615-4a3e-8644-038e6268c30f","Type":"ContainerStarted","Data":"90001fe0913acd5b5462c06d74a7e8d2c1ff560ba8750b128c3f5cc8559c00b5"} Dec 07 19:16:50 crc kubenswrapper[4815]: I1207 19:16:50.470293 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" event={"ID":"e5e588d4-9615-4a3e-8644-038e6268c30f","Type":"ContainerStarted","Data":"f7e42ce4cb3ad148dd915a9eefa18f97bccf2a30ebf4029e8b59973bced663cc"} Dec 07 19:16:50 crc kubenswrapper[4815]: I1207 19:16:50.769751 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:50 crc kubenswrapper[4815]: I1207 19:16:50.769771 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:50 crc kubenswrapper[4815]: E1207 19:16:50.769995 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:50 crc kubenswrapper[4815]: E1207 19:16:50.770101 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:51 crc kubenswrapper[4815]: I1207 19:16:51.769850 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:51 crc kubenswrapper[4815]: I1207 19:16:51.769897 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:51 crc kubenswrapper[4815]: E1207 19:16:51.770103 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:51 crc kubenswrapper[4815]: E1207 19:16:51.770221 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:52 crc kubenswrapper[4815]: I1207 19:16:52.769475 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:52 crc kubenswrapper[4815]: E1207 19:16:52.769626 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:52 crc kubenswrapper[4815]: I1207 19:16:52.770093 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:52 crc kubenswrapper[4815]: E1207 19:16:52.770954 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:53 crc kubenswrapper[4815]: I1207 19:16:53.769252 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:53 crc kubenswrapper[4815]: E1207 19:16:53.769471 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:53 crc kubenswrapper[4815]: I1207 19:16:53.770232 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:53 crc kubenswrapper[4815]: E1207 19:16:53.770360 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:53 crc kubenswrapper[4815]: I1207 19:16:53.771041 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:16:53 crc kubenswrapper[4815]: E1207 19:16:53.771435 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:16:54 crc kubenswrapper[4815]: I1207 19:16:54.768981 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:54 crc kubenswrapper[4815]: E1207 19:16:54.769773 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:54 crc kubenswrapper[4815]: I1207 19:16:54.769186 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:54 crc kubenswrapper[4815]: E1207 19:16:54.770204 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:55 crc kubenswrapper[4815]: I1207 19:16:55.773117 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:55 crc kubenswrapper[4815]: I1207 19:16:55.773343 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:55 crc kubenswrapper[4815]: E1207 19:16:55.773608 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:55 crc kubenswrapper[4815]: E1207 19:16:55.774214 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:55 crc kubenswrapper[4815]: E1207 19:16:55.779539 4815 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 07 19:16:55 crc kubenswrapper[4815]: E1207 19:16:55.972583 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 07 19:16:56 crc kubenswrapper[4815]: I1207 19:16:56.768878 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:56 crc kubenswrapper[4815]: E1207 19:16:56.769036 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:56 crc kubenswrapper[4815]: I1207 19:16:56.769233 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:56 crc kubenswrapper[4815]: E1207 19:16:56.769289 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:57 crc kubenswrapper[4815]: I1207 19:16:57.769846 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:57 crc kubenswrapper[4815]: I1207 19:16:57.769989 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:57 crc kubenswrapper[4815]: E1207 19:16:57.770138 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:57 crc kubenswrapper[4815]: E1207 19:16:57.770361 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:16:58 crc kubenswrapper[4815]: I1207 19:16:58.769580 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:16:58 crc kubenswrapper[4815]: I1207 19:16:58.769647 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:16:58 crc kubenswrapper[4815]: E1207 19:16:58.769818 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:16:58 crc kubenswrapper[4815]: E1207 19:16:58.769988 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:16:59 crc kubenswrapper[4815]: I1207 19:16:59.769157 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:16:59 crc kubenswrapper[4815]: I1207 19:16:59.769247 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:16:59 crc kubenswrapper[4815]: E1207 19:16:59.769394 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:16:59 crc kubenswrapper[4815]: E1207 19:16:59.769510 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.505508 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/1.log" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.506377 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/0.log" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.506448 4815 generic.go:334] "Generic (PLEG): container finished" podID="0b739f36-d9c4-4fb6-9ead-9df05e283dea" containerID="7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97" exitCode=1 Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.506489 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" 
event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerDied","Data":"7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97"} Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.506536 4815 scope.go:117] "RemoveContainer" containerID="73eedf1d85ce69ddf2fed2445ed9318cb4515aee47f89fe9b1db69bbf213ea2a" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.507209 4815 scope.go:117] "RemoveContainer" containerID="7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97" Dec 07 19:17:00 crc kubenswrapper[4815]: E1207 19:17:00.507622 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-s95hp_openshift-multus(0b739f36-d9c4-4fb6-9ead-9df05e283dea)\"" pod="openshift-multus/multus-s95hp" podUID="0b739f36-d9c4-4fb6-9ead-9df05e283dea" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.541602 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-spgpb" podStartSLOduration=95.541579515 podStartE2EDuration="1m35.541579515s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:16:50.488880913 +0000 UTC m=+115.067871008" watchObservedRunningTime="2025-12-07 19:17:00.541579515 +0000 UTC m=+125.120569600" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.769680 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:00 crc kubenswrapper[4815]: I1207 19:17:00.769741 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:00 crc kubenswrapper[4815]: E1207 19:17:00.769852 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:00 crc kubenswrapper[4815]: E1207 19:17:00.770029 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:00 crc kubenswrapper[4815]: E1207 19:17:00.973890 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 07 19:17:01 crc kubenswrapper[4815]: I1207 19:17:01.512229 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/1.log" Dec 07 19:17:01 crc kubenswrapper[4815]: I1207 19:17:01.769527 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:01 crc kubenswrapper[4815]: I1207 19:17:01.769553 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:01 crc kubenswrapper[4815]: E1207 19:17:01.769704 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:01 crc kubenswrapper[4815]: E1207 19:17:01.770046 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:02 crc kubenswrapper[4815]: I1207 19:17:02.769766 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:02 crc kubenswrapper[4815]: I1207 19:17:02.769789 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:02 crc kubenswrapper[4815]: E1207 19:17:02.770001 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:02 crc kubenswrapper[4815]: E1207 19:17:02.770070 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:03 crc kubenswrapper[4815]: I1207 19:17:03.769734 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:03 crc kubenswrapper[4815]: E1207 19:17:03.769971 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:03 crc kubenswrapper[4815]: I1207 19:17:03.770978 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:03 crc kubenswrapper[4815]: E1207 19:17:03.771384 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:04 crc kubenswrapper[4815]: I1207 19:17:04.769528 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:04 crc kubenswrapper[4815]: E1207 19:17:04.769688 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:04 crc kubenswrapper[4815]: I1207 19:17:04.769548 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:04 crc kubenswrapper[4815]: E1207 19:17:04.770117 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:05 crc kubenswrapper[4815]: I1207 19:17:05.769599 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:05 crc kubenswrapper[4815]: I1207 19:17:05.769639 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:05 crc kubenswrapper[4815]: E1207 19:17:05.771656 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:05 crc kubenswrapper[4815]: E1207 19:17:05.771820 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:05 crc kubenswrapper[4815]: E1207 19:17:05.975582 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 07 19:17:06 crc kubenswrapper[4815]: I1207 19:17:06.769449 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:06 crc kubenswrapper[4815]: I1207 19:17:06.769449 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:06 crc kubenswrapper[4815]: E1207 19:17:06.769708 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:06 crc kubenswrapper[4815]: E1207 19:17:06.769808 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:07 crc kubenswrapper[4815]: I1207 19:17:07.769768 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:07 crc kubenswrapper[4815]: E1207 19:17:07.770021 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:07 crc kubenswrapper[4815]: I1207 19:17:07.770149 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:07 crc kubenswrapper[4815]: E1207 19:17:07.770911 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:07 crc kubenswrapper[4815]: I1207 19:17:07.771483 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:17:07 crc kubenswrapper[4815]: E1207 19:17:07.771777 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zzw6c_openshift-ovn-kubernetes(13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" Dec 07 19:17:08 crc kubenswrapper[4815]: I1207 19:17:08.769511 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:08 crc kubenswrapper[4815]: I1207 19:17:08.769610 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:08 crc kubenswrapper[4815]: E1207 19:17:08.769727 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:08 crc kubenswrapper[4815]: E1207 19:17:08.769846 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:09 crc kubenswrapper[4815]: I1207 19:17:09.769218 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:09 crc kubenswrapper[4815]: I1207 19:17:09.769218 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:09 crc kubenswrapper[4815]: E1207 19:17:09.769458 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:09 crc kubenswrapper[4815]: E1207 19:17:09.769531 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:10 crc kubenswrapper[4815]: I1207 19:17:10.769374 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:10 crc kubenswrapper[4815]: I1207 19:17:10.769413 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:10 crc kubenswrapper[4815]: E1207 19:17:10.769676 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:10 crc kubenswrapper[4815]: E1207 19:17:10.769806 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:10 crc kubenswrapper[4815]: I1207 19:17:10.770083 4815 scope.go:117] "RemoveContainer" containerID="7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97" Dec 07 19:17:10 crc kubenswrapper[4815]: E1207 19:17:10.976387 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 07 19:17:11 crc kubenswrapper[4815]: I1207 19:17:11.554416 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/1.log" Dec 07 19:17:11 crc kubenswrapper[4815]: I1207 19:17:11.554514 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerStarted","Data":"17214bf8d7b3f024980012a53d6b512815ab09eca779cdfe9e2a75a966a21663"} Dec 07 19:17:11 crc kubenswrapper[4815]: I1207 19:17:11.769632 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:11 crc kubenswrapper[4815]: I1207 19:17:11.769632 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:11 crc kubenswrapper[4815]: E1207 19:17:11.769861 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:11 crc kubenswrapper[4815]: E1207 19:17:11.770041 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:12 crc kubenswrapper[4815]: I1207 19:17:12.769743 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:12 crc kubenswrapper[4815]: I1207 19:17:12.769743 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:12 crc kubenswrapper[4815]: E1207 19:17:12.770022 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:12 crc kubenswrapper[4815]: E1207 19:17:12.770074 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:13 crc kubenswrapper[4815]: I1207 19:17:13.769581 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:13 crc kubenswrapper[4815]: I1207 19:17:13.769744 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:13 crc kubenswrapper[4815]: E1207 19:17:13.769805 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:13 crc kubenswrapper[4815]: E1207 19:17:13.770085 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:14 crc kubenswrapper[4815]: I1207 19:17:14.769560 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:14 crc kubenswrapper[4815]: I1207 19:17:14.769595 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:14 crc kubenswrapper[4815]: E1207 19:17:14.769726 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:14 crc kubenswrapper[4815]: E1207 19:17:14.770004 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:15 crc kubenswrapper[4815]: I1207 19:17:15.769600 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:15 crc kubenswrapper[4815]: I1207 19:17:15.769647 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:15 crc kubenswrapper[4815]: E1207 19:17:15.771025 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:15 crc kubenswrapper[4815]: E1207 19:17:15.771711 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:15 crc kubenswrapper[4815]: E1207 19:17:15.977582 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 07 19:17:16 crc kubenswrapper[4815]: I1207 19:17:16.768748 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:16 crc kubenswrapper[4815]: I1207 19:17:16.768771 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:16 crc kubenswrapper[4815]: E1207 19:17:16.768876 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:16 crc kubenswrapper[4815]: E1207 19:17:16.769119 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:17 crc kubenswrapper[4815]: I1207 19:17:17.769500 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:17 crc kubenswrapper[4815]: I1207 19:17:17.769510 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:17 crc kubenswrapper[4815]: E1207 19:17:17.769711 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:17 crc kubenswrapper[4815]: E1207 19:17:17.769803 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:18 crc kubenswrapper[4815]: I1207 19:17:18.769632 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:18 crc kubenswrapper[4815]: I1207 19:17:18.769666 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:18 crc kubenswrapper[4815]: E1207 19:17:18.770903 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:18 crc kubenswrapper[4815]: E1207 19:17:18.770899 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:19 crc kubenswrapper[4815]: I1207 19:17:19.769781 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:19 crc kubenswrapper[4815]: I1207 19:17:19.769855 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:19 crc kubenswrapper[4815]: E1207 19:17:19.770004 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:19 crc kubenswrapper[4815]: E1207 19:17:19.770203 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:20 crc kubenswrapper[4815]: I1207 19:17:20.769124 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:20 crc kubenswrapper[4815]: I1207 19:17:20.769124 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:20 crc kubenswrapper[4815]: E1207 19:17:20.769336 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:20 crc kubenswrapper[4815]: E1207 19:17:20.769392 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:20 crc kubenswrapper[4815]: I1207 19:17:20.770435 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:17:20 crc kubenswrapper[4815]: E1207 19:17:20.978841 4815 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.600505 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/3.log" Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.604097 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerStarted","Data":"d149289581f96fc5a122787e8f8595783c8924d8fa538104d01ab788729c41fd"} Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.606564 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.613244 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xbq22"] Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.613341 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:21 crc kubenswrapper[4815]: E1207 19:17:21.613420 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.657132 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podStartSLOduration=116.657075567 podStartE2EDuration="1m56.657075567s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:21.654759683 +0000 UTC m=+146.233749738" watchObservedRunningTime="2025-12-07 19:17:21.657075567 +0000 UTC m=+146.236065652" Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.770337 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:21 crc kubenswrapper[4815]: E1207 19:17:21.770518 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:21 crc kubenswrapper[4815]: I1207 19:17:21.770764 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:21 crc kubenswrapper[4815]: E1207 19:17:21.770858 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:22 crc kubenswrapper[4815]: I1207 19:17:22.768823 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:22 crc kubenswrapper[4815]: E1207 19:17:22.769092 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:23 crc kubenswrapper[4815]: I1207 19:17:23.769614 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:23 crc kubenswrapper[4815]: I1207 19:17:23.769684 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:23 crc kubenswrapper[4815]: E1207 19:17:23.769809 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:23 crc kubenswrapper[4815]: I1207 19:17:23.769836 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:23 crc kubenswrapper[4815]: E1207 19:17:23.770031 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:23 crc kubenswrapper[4815]: E1207 19:17:23.770198 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:24 crc kubenswrapper[4815]: I1207 19:17:24.769662 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:24 crc kubenswrapper[4815]: E1207 19:17:24.769840 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 07 19:17:25 crc kubenswrapper[4815]: I1207 19:17:25.769321 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:25 crc kubenswrapper[4815]: I1207 19:17:25.769329 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:25 crc kubenswrapper[4815]: E1207 19:17:25.771180 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 07 19:17:25 crc kubenswrapper[4815]: I1207 19:17:25.771205 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:25 crc kubenswrapper[4815]: E1207 19:17:25.771246 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 07 19:17:25 crc kubenswrapper[4815]: E1207 19:17:25.771381 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xbq22" podUID="201e9ba8-3e19-4555-90f0-587497a2a328" Dec 07 19:17:26 crc kubenswrapper[4815]: I1207 19:17:26.360131 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:17:26 crc kubenswrapper[4815]: I1207 19:17:26.360227 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:17:26 crc kubenswrapper[4815]: I1207 19:17:26.768792 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:26 crc kubenswrapper[4815]: I1207 19:17:26.770953 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 07 19:17:26 crc kubenswrapper[4815]: I1207 19:17:26.772311 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.200820 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.769803 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.769844 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.769803 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.773174 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.773757 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.774220 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 07 19:17:27 crc kubenswrapper[4815]: I1207 19:17:27.775143 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.845724 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.845895 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:28 crc kubenswrapper[4815]: E1207 19:17:28.846014 4815 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:19:30.845963473 +0000 UTC m=+275.424953558 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.846064 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.846104 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.846149 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.847140 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.852750 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.853329 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.853612 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.891174 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:28 crc kubenswrapper[4815]: I1207 19:17:28.990614 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 07 19:17:29 crc kubenswrapper[4815]: I1207 19:17:29.009995 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 07 19:17:29 crc kubenswrapper[4815]: W1207 19:17:29.283204 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-ff0bedc2f66424cc30f45aadd0de363fac2566cc6e7fe5b97a3bc8772bb659c0 WatchSource:0}: Error finding container ff0bedc2f66424cc30f45aadd0de363fac2566cc6e7fe5b97a3bc8772bb659c0: Status 404 returned error can't find the container with id ff0bedc2f66424cc30f45aadd0de363fac2566cc6e7fe5b97a3bc8772bb659c0 Dec 07 19:17:29 crc kubenswrapper[4815]: I1207 19:17:29.634686 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ff0bedc2f66424cc30f45aadd0de363fac2566cc6e7fe5b97a3bc8772bb659c0"} Dec 07 19:17:29 crc kubenswrapper[4815]: I1207 19:17:29.636245 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dabc7735720c6f38e2964e2f2b6b744108a3b97d8da7719c221b908f58af1a0b"} Dec 07 19:17:29 crc kubenswrapper[4815]: I1207 19:17:29.638497 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3c403b42514dca817b246703f836c1f34c6330b8988e4eda8305183df075289a"} Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.063319 4815 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.119497 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.120145 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.120562 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.121127 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.133323 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.133623 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.134564 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.135053 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.139308 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.139326 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.139326 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.139399 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.139411 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.140262 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.143290 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.148694 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.149727 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.159519 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.162048 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nz5vs"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.162585 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.162853 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.163272 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.163381 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.165122 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.165276 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.165378 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.165675 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.165804 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.166332 
4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.168512 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.168759 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.169086 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.169416 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.169440 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.177024 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c2mqh"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.182561 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.182949 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.185473 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.185985 4815 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.186117 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.186304 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.186410 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.186507 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.190160 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.190719 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.191763 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.191963 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.192070 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.192161 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.194274 4815 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.198204 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mxwsj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.198533 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.198777 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gpv67"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.198972 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.198990 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199182 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199276 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199373 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199429 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199577 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.199693 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kznql"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199930 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.199982 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z4njj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200029 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200151 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200389 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200465 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200569 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200622 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200673 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200698 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200819 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.200847 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.204402 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.205413 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.218867 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.219199 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.219232 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.219825 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.226278 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.226434 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.226611 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.226732 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227007 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227062 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227211 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227277 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227350 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227460 4815 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227606 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227648 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227826 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227994 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.228031 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.228105 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.228143 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.228227 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.228484 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.227526 4815 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.230737 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.230843 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231188 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231285 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231320 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231491 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231650 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.231837 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.232972 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.233767 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.238181 4815 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.238311 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nz5vs"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.238339 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c2mqh"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.238349 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.239225 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.239343 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.239428 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.239507 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.239961 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.240115 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.240182 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.240448 4815 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.241093 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.241179 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.241270 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.241345 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.242758 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.243200 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.243984 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.244105 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.242752 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mxwsj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.244433 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.244749 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.245900 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246026 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246038 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246165 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246408 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246543 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.246838 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.249103 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.251082 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.251304 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.255931 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.264793 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.264830 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.264894 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95mgg\" (UniqueName: \"kubernetes.io/projected/c13c8a39-c00b-46e6-b721-57c74409a776-kube-api-access-95mgg\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.264976 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265010 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265033 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265057 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265073 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265090 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir\") pod \"oauth-openshift-558db77b4-r924c\" (UID: 
\"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265104 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-service-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265134 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265174 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1523d123-99b7-4f3f-a3f6-c5502562eedc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265227 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: 
I1207 19:17:30.265252 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265266 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sth8\" (UniqueName: \"kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265288 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjtrc\" (UniqueName: \"kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265329 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c13c8a39-c00b-46e6-b721-57c74409a776-machine-approver-tls\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265346 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265386 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265415 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-config\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265464 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265480 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdxz\" (UniqueName: \"kubernetes.io/projected/a0f846e4-03b3-4bce-a909-75339183ebcb-kube-api-access-6tdxz\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265531 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265549 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chf44\" (UniqueName: \"kubernetes.io/projected/1523d123-99b7-4f3f-a3f6-c5502562eedc-kube-api-access-chf44\") pod \"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265593 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msgq4\" (UniqueName: \"kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265610 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265654 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265671 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265696 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265728 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265886 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r924c\" (UID: 
\"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265930 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265981 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-auth-proxy-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.265999 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.266012 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f846e4-03b3-4bce-a909-75339183ebcb-serving-cert\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: 
I1207 19:17:30.266073 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.309678 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.328862 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.329417 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7gwm4"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.329962 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.330097 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.338721 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.339206 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.339500 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.339755 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.342553 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-t955b"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.343157 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.343460 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c74lm"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.352399 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.353004 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.359844 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.362490 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.363443 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.363539 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.371555 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.372905 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.374206 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376298 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376356 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376408 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-encryption-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376442 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctpn\" (UniqueName: \"kubernetes.io/projected/25d8da8c-5fc5-42ea-8779-2394e32fadee-kube-api-access-hctpn\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376474 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376532 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdxz\" (UniqueName: \"kubernetes.io/projected/a0f846e4-03b3-4bce-a909-75339183ebcb-kube-api-access-6tdxz\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376572 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-image-import-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376592 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376623 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.376644 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tnc\" (UniqueName: \"kubernetes.io/projected/8e21be4e-1543-4e91-b451-6b7d9f258195-kube-api-access-v6tnc\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376749 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chf44\" (UniqueName: \"kubernetes.io/projected/1523d123-99b7-4f3f-a3f6-c5502562eedc-kube-api-access-chf44\") pod \"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376768 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376787 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25d8da8c-5fc5-42ea-8779-2394e32fadee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376810 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msgq4\" (UniqueName: 
\"kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376833 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376855 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bftb\" (UniqueName: \"kubernetes.io/projected/a2bbe4d3-67cf-4517-93c7-528e1694f76c-kube-api-access-7bftb\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376875 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.376893 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.377003 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377030 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377052 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377072 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377094 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-client\") pod 
\"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377119 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377143 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9m8v\" (UniqueName: \"kubernetes.io/projected/bc66ff05-f3e1-457d-bed3-b56751586ac4-kube-api-access-x9m8v\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377163 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377258 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377277 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc66ff05-f3e1-457d-bed3-b56751586ac4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377299 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e21be4e-1543-4e91-b451-6b7d9f258195-serving-cert\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377317 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzd2z\" (UniqueName: \"kubernetes.io/projected/cb417697-bbf5-4de5-ae9c-c04c37623e57-kube-api-access-kzd2z\") pod \"downloads-7954f5f757-gpv67\" (UID: \"cb417697-bbf5-4de5-ae9c-c04c37623e57\") " pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377348 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377367 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-etcd-client\") pod 
\"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377390 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377412 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-audit\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377543 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377565 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377584 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-encryption-config\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377606 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-auth-proxy-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377625 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377645 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f846e4-03b3-4bce-a909-75339183ebcb-serving-cert\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377665 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2rw2\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-kube-api-access-q2rw2\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377702 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377690 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377782 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4zx\" (UniqueName: \"kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377805 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-serving-cert\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377847 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377867 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25d8da8c-5fc5-42ea-8779-2394e32fadee-serving-cert\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.377897 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.379018 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.380563 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.384595 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-audit-dir\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.384661 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385073 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385214 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-kube-api-access-t85jj\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385357 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9bkv\" (UniqueName: \"kubernetes.io/projected/d4648241-9b66-421e-b267-fc03442657a8-kube-api-access-b9bkv\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385482 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95mgg\" (UniqueName: \"kubernetes.io/projected/c13c8a39-c00b-46e6-b721-57c74409a776-kube-api-access-95mgg\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385603 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385722 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.385845 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.385998 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386118 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386218 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-node-pullsecrets\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386317 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htccl\" (UniqueName: \"kubernetes.io/projected/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-kube-api-access-htccl\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386418 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert\") pod 
\"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386517 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386614 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386726 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-service-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386832 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-dir\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386990 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-serving-cert\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.387110 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.387226 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1523d123-99b7-4f3f-a3f6-c5502562eedc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400174 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-trusted-ca\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400228 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-config\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 
07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400265 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400286 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-config\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400311 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400334 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sth8\" (UniqueName: \"kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400359 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-policies\") pod 
\"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400378 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.400400 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc66ff05-f3e1-457d-bed3-b56751586ac4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.396243 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386229 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.386563 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.403626 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.405583 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.406284 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.406536 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c13c8a39-c00b-46e6-b721-57c74409a776-auth-proxy-config\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.407132 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.416325 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.416854 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.438867 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.439203 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.439557 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjtrc\" (UniqueName: \"kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: 
\"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.439664 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c13c8a39-c00b-46e6-b721-57c74409a776-machine-approver-tls\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.440558 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-images\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.440681 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.440769 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-config\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.441522 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-config\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.442431 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.443708 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.445201 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f846e4-03b3-4bce-a909-75339183ebcb-serving-cert\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.444813 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.450429 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert\") pod \"console-f9d7485db-xxlhj\" (UID: 
\"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.452466 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.453568 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.453641 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.454112 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.454471 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.454515 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.454783 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rsb58"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.454845 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c13c8a39-c00b-46e6-b721-57c74409a776-machine-approver-tls\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.455286 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.455456 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.455617 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.455663 4815 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.456174 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0f846e4-03b3-4bce-a909-75339183ebcb-service-ca-bundle\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.456193 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zzzb"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.456357 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.443778 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.456866 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.444767 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.443990 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.457377 4815 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-7pcrk"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.457731 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.457738 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qfbp2"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.457811 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.458154 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.458362 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.458569 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.458989 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1523d123-99b7-4f3f-a3f6-c5502562eedc-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.459082 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.459363 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.459540 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.459962 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.460105 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.460140 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.460184 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.460280 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.462790 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.464133 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.464287 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.466333 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.466825 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.467296 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.467964 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.468568 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.469050 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.469338 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sbvc7"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.470000 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.471408 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.472097 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gpv67"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.472341 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.472617 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.473809 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.475184 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.475871 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.475975 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.476686 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kznql"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.481794 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.483278 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.484879 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.486576 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-7gwm4"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.488311 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c74lm"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.488335 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.489500 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.490865 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.492658 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.493897 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.495744 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.501455 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.501533 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.502079 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.503546 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.505327 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sbvc7"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.506260 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.511521 4815 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.517371 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-74mbw"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.520802 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7pcrk"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.521384 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.526131 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.531551 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zzzb"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.533014 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.534739 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rsb58"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.536534 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z4njj"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.541274 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542084 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-service-ca\") pod 
\"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542118 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542137 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-encryption-config\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542156 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2rw2\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-kube-api-access-q2rw2\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542174 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4zx\" (UniqueName: \"kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542190 4815 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-serving-cert\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542212 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d44182d-206b-47ba-9355-dd21174fbea9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542229 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d44182d-206b-47ba-9355-dd21174fbea9-config\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542245 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25d8da8c-5fc5-42ea-8779-2394e32fadee-serving-cert\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542261 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542276 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-audit-dir\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542291 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-default-certificate\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542308 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-kube-api-access-t85jj\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542341 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9bkv\" (UniqueName: 
\"kubernetes.io/projected/d4648241-9b66-421e-b267-fc03442657a8-kube-api-access-b9bkv\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542362 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-stats-auth\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542379 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-ca\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542398 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542421 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542438 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-node-pullsecrets\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542457 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htccl\" (UniqueName: \"kubernetes.io/projected/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-kube-api-access-htccl\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542477 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-dir\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542495 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-serving-cert\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542513 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgmqb\" (UniqueName: \"kubernetes.io/projected/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-kube-api-access-zgmqb\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542530 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnjc8\" (UniqueName: \"kubernetes.io/projected/00cce455-9eba-4d80-a518-6c05a8efece5-kube-api-access-nnjc8\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542553 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-trusted-ca\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542572 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-config\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542589 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-service-ca-bundle\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542608 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-config\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " 
pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542625 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-serving-cert\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542650 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-policies\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542667 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc66ff05-f3e1-457d-bed3-b56751586ac4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542698 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-images\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542718 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7ddadb50-73b1-4948-807b-fe26ca78ea76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542740 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542756 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542772 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-encryption-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542790 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctpn\" (UniqueName: \"kubernetes.io/projected/25d8da8c-5fc5-42ea-8779-2394e32fadee-kube-api-access-hctpn\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.542811 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-image-import-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542828 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542844 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d44182d-206b-47ba-9355-dd21174fbea9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542866 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6tnc\" (UniqueName: \"kubernetes.io/projected/8e21be4e-1543-4e91-b451-6b7d9f258195-kube-api-access-v6tnc\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542888 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: 
\"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542905 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25d8da8c-5fc5-42ea-8779-2394e32fadee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542933 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6029b56f-3736-417d-b832-46b53a99a505-metrics-tls\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542948 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jngql\" (UniqueName: \"kubernetes.io/projected/6029b56f-3736-417d-b832-46b53a99a505-kube-api-access-jngql\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542971 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bftb\" (UniqueName: \"kubernetes.io/projected/a2bbe4d3-67cf-4517-93c7-528e1694f76c-kube-api-access-7bftb\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.542986 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543001 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-metrics-certs\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543019 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543036 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-client\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.543087 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543104 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9m8v\" (UniqueName: \"kubernetes.io/projected/bc66ff05-f3e1-457d-bed3-b56751586ac4-kube-api-access-x9m8v\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543122 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftw2\" (UniqueName: \"kubernetes.io/projected/7ddadb50-73b1-4948-807b-fe26ca78ea76-kube-api-access-rftw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543145 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543159 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 
19:17:30.543161 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc66ff05-f3e1-457d-bed3-b56751586ac4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.543866 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc66ff05-f3e1-457d-bed3-b56751586ac4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.545894 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-policies\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.547341 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-etcd-serving-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.547384 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-audit-dir\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.547723 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-config\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.549859 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-encryption-config\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550209 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25d8da8c-5fc5-42ea-8779-2394e32fadee-serving-cert\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550280 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc66ff05-f3e1-457d-bed3-b56751586ac4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550326 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m"] Dec 07 19:17:30 crc kubenswrapper[4815]: 
I1207 19:17:30.550356 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550370 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-images\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550380 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a2bbe4d3-67cf-4517-93c7-528e1694f76c-audit-dir\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550817 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.550875 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.551746 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d4648241-9b66-421e-b267-fc03442657a8-node-pullsecrets\") pod \"apiserver-76f77b778f-z4njj\" (UID: 
\"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.552205 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.552252 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.552542 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.552600 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2bbe4d3-67cf-4517-93c7-528e1694f76c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.553130 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-config\") pod \"console-operator-58897d9998-c2mqh\" (UID: 
\"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.553943 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e21be4e-1543-4e91-b451-6b7d9f258195-serving-cert\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.554013 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzd2z\" (UniqueName: \"kubernetes.io/projected/cb417697-bbf5-4de5-ae9c-c04c37623e57-kube-api-access-kzd2z\") pod \"downloads-7954f5f757-gpv67\" (UID: \"cb417697-bbf5-4de5-ae9c-c04c37623e57\") " pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.554042 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-config\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.554110 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-image-import-ca\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.554875 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-etcd-client\") pod 
\"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555051 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555102 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-audit\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555153 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-client\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555436 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-config\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555568 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.555871 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/25d8da8c-5fc5-42ea-8779-2394e32fadee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.556304 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/d4648241-9b66-421e-b267-fc03442657a8-audit\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.556700 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e21be4e-1543-4e91-b451-6b7d9f258195-trusted-ca\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.557047 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-etcd-client\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.557733 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.558828 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2bbe4d3-67cf-4517-93c7-528e1694f76c-serving-cert\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.560528 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-74mbw"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.560533 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-serving-cert\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.561432 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e21be4e-1543-4e91-b451-6b7d9f258195-serving-cert\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.561993 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.562932 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.563005 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x"] Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.564308 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.564799 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.565277 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-etcd-client\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.566276 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: 
\"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.566389 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d4648241-9b66-421e-b267-fc03442657a8-encryption-config\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.582980 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.603122 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.622937 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.642435 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"439131639b3db2c54fa4db8f0818d645491cd3f295f651c50b3a460904d0b6a5"} Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.642856 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.643034 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.644208 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ada0e9d8119c0f6339556852021c1519f2be175a1e10a7166f3ecedc4473527"} Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.645631 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"02663723d9482c67b1d27c9bfae22b0813ad2879b67cb62c587466892f19d72a"} Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655627 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-service-ca-bundle\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655654 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnjc8\" (UniqueName: \"kubernetes.io/projected/00cce455-9eba-4d80-a518-6c05a8efece5-kube-api-access-nnjc8\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655678 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-serving-cert\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655720 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7ddadb50-73b1-4948-807b-fe26ca78ea76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655757 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d44182d-206b-47ba-9355-dd21174fbea9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655777 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6029b56f-3736-417d-b832-46b53a99a505-metrics-tls\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655805 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jngql\" (UniqueName: \"kubernetes.io/projected/6029b56f-3736-417d-b832-46b53a99a505-kube-api-access-jngql\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655819 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-metrics-certs\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: 
I1207 19:17:30.655841 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftw2\" (UniqueName: \"kubernetes.io/projected/7ddadb50-73b1-4948-807b-fe26ca78ea76-kube-api-access-rftw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655867 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-config\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655884 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-client\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655898 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-service-ca\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655951 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d44182d-206b-47ba-9355-dd21174fbea9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655966 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-default-certificate\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.655981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d44182d-206b-47ba-9355-dd21174fbea9-config\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.656018 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-stats-auth\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.656031 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-ca\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.656053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgmqb\" (UniqueName: 
\"kubernetes.io/projected/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-kube-api-access-zgmqb\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.657008 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-service-ca-bundle\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.657178 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-config\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.657624 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-service-ca\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.657954 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-ca\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.661428 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-etcd-client\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.661506 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/00cce455-9eba-4d80-a518-6c05a8efece5-serving-cert\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.661735 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-default-certificate\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.661817 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6029b56f-3736-417d-b832-46b53a99a505-metrics-tls\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.661887 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-stats-auth\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.682838 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 07 19:17:30 crc 
kubenswrapper[4815]: I1207 19:17:30.690366 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-metrics-certs\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.702695 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.723535 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.744186 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.764274 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.768986 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d44182d-206b-47ba-9355-dd21174fbea9-config\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.783620 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.803573 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 07 
19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.843605 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.848016 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.862714 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.886796 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.903160 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.911844 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d44182d-206b-47ba-9355-dd21174fbea9-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.922968 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.943237 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.963659 4815 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.978490 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/7ddadb50-73b1-4948-807b-fe26ca78ea76-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:30 crc kubenswrapper[4815]: I1207 19:17:30.982955 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.027583 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msgq4\" (UniqueName: \"kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4\") pod \"console-f9d7485db-xxlhj\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.041414 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdxz\" (UniqueName: \"kubernetes.io/projected/a0f846e4-03b3-4bce-a909-75339183ebcb-kube-api-access-6tdxz\") pod \"authentication-operator-69f744f599-nz5vs\" (UID: \"a0f846e4-03b3-4bce-a909-75339183ebcb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.063662 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.071411 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chf44\" (UniqueName: 
\"kubernetes.io/projected/1523d123-99b7-4f3f-a3f6-c5502562eedc-kube-api-access-chf44\") pod \"cluster-samples-operator-665b6dd947-r87km\" (UID: \"1523d123-99b7-4f3f-a3f6-c5502562eedc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.079954 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.089042 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.102109 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.137153 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjtrc\" (UniqueName: \"kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc\") pod \"route-controller-manager-6576b87f9c-mh7gn\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.138429 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95mgg\" (UniqueName: \"kubernetes.io/projected/c13c8a39-c00b-46e6-b721-57c74409a776-kube-api-access-95mgg\") pod \"machine-approver-56656f9798-9hczz\" (UID: \"c13c8a39-c00b-46e6-b721-57c74409a776\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.146831 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sth8\" (UniqueName: 
\"kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8\") pod \"oauth-openshift-558db77b4-r924c\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.164968 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.190245 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.204510 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.225296 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.244241 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.264706 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.283573 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.305249 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.323491 4815 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.339091 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.342497 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.352941 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:17:31 crc kubenswrapper[4815]: W1207 19:17:31.363095 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d9863a_2779_463c_8d73_76246a51b333.slice/crio-61a99deab59b2786e77d7d3b14a9b598cfcf1fe6178097309645deeddf7bc3eb WatchSource:0}: Error finding container 61a99deab59b2786e77d7d3b14a9b598cfcf1fe6178097309645deeddf7bc3eb: Status 404 returned error can't find the container with id 61a99deab59b2786e77d7d3b14a9b598cfcf1fe6178097309645deeddf7bc3eb Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.363353 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.367962 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.386261 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.403506 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.417463 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.427150 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: W1207 19:17:31.437544 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13c8a39_c00b_46e6_b721_57c74409a776.slice/crio-875bd86c4ff1665a7014cba5bb880d810b651e5b5eabbe493b5305376d44f469 WatchSource:0}: Error finding container 875bd86c4ff1665a7014cba5bb880d810b651e5b5eabbe493b5305376d44f469: Status 404 returned error can't find the container with id 875bd86c4ff1665a7014cba5bb880d810b651e5b5eabbe493b5305376d44f469 Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.443241 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.461895 4815 request.go:700] Waited for 1.002079946s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serving-cert&limit=500&resourceVersion=0 Dec 07 19:17:31 crc 
kubenswrapper[4815]: I1207 19:17:31.463237 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.485179 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.502733 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.523718 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.527723 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km"] Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.528807 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nz5vs"] Dec 07 19:17:31 crc kubenswrapper[4815]: W1207 19:17:31.540460 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f846e4_03b3_4bce_a909_75339183ebcb.slice/crio-089344ae8cb8bc48b699b1c5e28e1db219798f073e0d3b26db2d0c72059757ca WatchSource:0}: Error finding container 089344ae8cb8bc48b699b1c5e28e1db219798f073e0d3b26db2d0c72059757ca: Status 404 returned error can't find the container with id 089344ae8cb8bc48b699b1c5e28e1db219798f073e0d3b26db2d0c72059757ca Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.543099 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.563281 4815 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.566334 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:17:31 crc kubenswrapper[4815]: W1207 19:17:31.577279 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbec086_a865_4859_adbf_ab61d8395463.slice/crio-f6d4a5da3a34ecd3946091b6fd90aa5ee96c95db062dd3ead3309532f7ca6854 WatchSource:0}: Error finding container f6d4a5da3a34ecd3946091b6fd90aa5ee96c95db062dd3ead3309532f7ca6854: Status 404 returned error can't find the container with id f6d4a5da3a34ecd3946091b6fd90aa5ee96c95db062dd3ead3309532f7ca6854 Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.585721 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.602718 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.626929 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.643577 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.659680 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xxlhj" event={"ID":"f8d9863a-2779-463c-8d73-76246a51b333","Type":"ContainerStarted","Data":"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.659765 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xxlhj" 
event={"ID":"f8d9863a-2779-463c-8d73-76246a51b333","Type":"ContainerStarted","Data":"61a99deab59b2786e77d7d3b14a9b598cfcf1fe6178097309645deeddf7bc3eb"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.665277 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" event={"ID":"4dbec086-a865-4859-adbf-ab61d8395463","Type":"ContainerStarted","Data":"f6d4a5da3a34ecd3946091b6fd90aa5ee96c95db062dd3ead3309532f7ca6854"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.665335 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.671103 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" event={"ID":"1523d123-99b7-4f3f-a3f6-c5502562eedc","Type":"ContainerStarted","Data":"7ca77bd43e9c8a90ef4c7fc4004a597dd7ac810f7817fc12b0326d41889cff59"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.673360 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" event={"ID":"c13c8a39-c00b-46e6-b721-57c74409a776","Type":"ContainerStarted","Data":"875bd86c4ff1665a7014cba5bb880d810b651e5b5eabbe493b5305376d44f469"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.675632 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" event={"ID":"a0f846e4-03b3-4bce-a909-75339183ebcb","Type":"ContainerStarted","Data":"5452d0b8514f3028677d531a1b50ab5cba6d72e52fbe93ce8baf4beb321c83ed"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.676996 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" 
event={"ID":"a0f846e4-03b3-4bce-a909-75339183ebcb","Type":"ContainerStarted","Data":"089344ae8cb8bc48b699b1c5e28e1db219798f073e0d3b26db2d0c72059757ca"} Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.683290 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.702931 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.706521 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"] Dec 07 19:17:31 crc kubenswrapper[4815]: W1207 19:17:31.713534 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f6cc294_684f_4ac2_8eb1_183af364c619.slice/crio-f2e64ceb5a41b2db28d78f7abd922bde732d98001eab561a7aecde4718a39fe9 WatchSource:0}: Error finding container f2e64ceb5a41b2db28d78f7abd922bde732d98001eab561a7aecde4718a39fe9: Status 404 returned error can't find the container with id f2e64ceb5a41b2db28d78f7abd922bde732d98001eab561a7aecde4718a39fe9 Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.723257 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.743613 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.764204 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.783697 4815 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.802777 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.829483 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.843144 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.863389 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.884515 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.903382 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.923162 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.944162 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.963097 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 07 19:17:31 crc kubenswrapper[4815]: I1207 19:17:31.983830 4815 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.002505 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.023054 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.044200 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.063645 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.088309 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.109470 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.123510 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.143709 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.164402 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.182951 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 07 19:17:32 crc 
kubenswrapper[4815]: I1207 19:17:32.205251 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.222979 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.243579 4815 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.263943 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.282737 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.322237 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9bkv\" (UniqueName: \"kubernetes.io/projected/d4648241-9b66-421e-b267-fc03442657a8-kube-api-access-b9bkv\") pod \"apiserver-76f77b778f-z4njj\" (UID: \"d4648241-9b66-421e-b267-fc03442657a8\") " pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.337248 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2rw2\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-kube-api-access-q2rw2\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.360189 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4zx\" (UniqueName: 
\"kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx\") pod \"controller-manager-879f6c89f-6hr2j\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.379948 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a3d0d98-5bcb-44a8-beda-1a8559f504ad-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cdxxd\" (UID: \"4a3d0d98-5bcb-44a8-beda-1a8559f504ad\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.396254 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85jj\" (UniqueName: \"kubernetes.io/projected/d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b-kube-api-access-t85jj\") pod \"machine-api-operator-5694c8668f-mxwsj\" (UID: \"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.398159 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.416940 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bftb\" (UniqueName: \"kubernetes.io/projected/a2bbe4d3-67cf-4517-93c7-528e1694f76c-kube-api-access-7bftb\") pod \"apiserver-7bbb656c7d-k72f8\" (UID: \"a2bbe4d3-67cf-4517-93c7-528e1694f76c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.430292 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.436898 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9m8v\" (UniqueName: \"kubernetes.io/projected/bc66ff05-f3e1-457d-bed3-b56751586ac4-kube-api-access-x9m8v\") pod \"openshift-controller-manager-operator-756b6f6bc6-qpls2\" (UID: \"bc66ff05-f3e1-457d-bed3-b56751586ac4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.456231 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htccl\" (UniqueName: \"kubernetes.io/projected/e56a0152-c9d6-4e9c-9a59-4ef99aab6524-kube-api-access-htccl\") pod \"openshift-apiserver-operator-796bbdcf4f-ks27d\" (UID: \"e56a0152-c9d6-4e9c-9a59-4ef99aab6524\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.462422 4815 request.go:700] Waited for 1.909745305s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.462825 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.486982 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.487340 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hctpn\" (UniqueName: \"kubernetes.io/projected/25d8da8c-5fc5-42ea-8779-2394e32fadee-kube-api-access-hctpn\") pod \"openshift-config-operator-7777fb866f-kznql\" (UID: \"25d8da8c-5fc5-42ea-8779-2394e32fadee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.497960 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzd2z\" (UniqueName: \"kubernetes.io/projected/cb417697-bbf5-4de5-ae9c-c04c37623e57-kube-api-access-kzd2z\") pod \"downloads-7954f5f757-gpv67\" (UID: \"cb417697-bbf5-4de5-ae9c-c04c37623e57\") " pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.521686 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6tnc\" (UniqueName: \"kubernetes.io/projected/8e21be4e-1543-4e91-b451-6b7d9f258195-kube-api-access-v6tnc\") pod \"console-operator-58897d9998-c2mqh\" (UID: \"8e21be4e-1543-4e91-b451-6b7d9f258195\") " pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.540620 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftw2\" (UniqueName: \"kubernetes.io/projected/7ddadb50-73b1-4948-807b-fe26ca78ea76-kube-api-access-rftw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-xnwvg\" (UID: \"7ddadb50-73b1-4948-807b-fe26ca78ea76\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.558429 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgmqb\" (UniqueName: 
\"kubernetes.io/projected/e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3-kube-api-access-zgmqb\") pod \"router-default-5444994796-t955b\" (UID: \"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3\") " pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.578356 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnjc8\" (UniqueName: \"kubernetes.io/projected/00cce455-9eba-4d80-a518-6c05a8efece5-kube-api-access-nnjc8\") pod \"etcd-operator-b45778765-c74lm\" (UID: \"00cce455-9eba-4d80-a518-6c05a8efece5\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.589191 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.594274 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.602188 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-mxwsj"] Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.611421 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jngql\" (UniqueName: \"kubernetes.io/projected/6029b56f-3736-417d-b832-46b53a99a505-kube-api-access-jngql\") pod \"dns-operator-744455d44c-7gwm4\" (UID: \"6029b56f-3736-417d-b832-46b53a99a505\") " pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.620831 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7d44182d-206b-47ba-9355-dd21174fbea9-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-cxk4m\" (UID: \"7d44182d-206b-47ba-9355-dd21174fbea9\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.628942 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.639101 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:32 crc kubenswrapper[4815]: W1207 19:17:32.648158 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1f45a14_a0f6_4585_8171_2ef3dc5e5d9b.slice/crio-135ab1325f0144e69807e5891782e84c45846c42c77254f6bc7104e190b66ca0 WatchSource:0}: Error finding container 135ab1325f0144e69807e5891782e84c45846c42c77254f6bc7104e190b66ca0: Status 404 returned error can't find the container with id 135ab1325f0144e69807e5891782e84c45846c42c77254f6bc7104e190b66ca0 Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.654784 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.670128 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.680278 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.690536 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" event={"ID":"9f6cc294-684f-4ac2-8eb1-183af364c619","Type":"ContainerStarted","Data":"503e958d61e1dfcaca0ec3d4a2ae6fa25c4b0e6db4e19faac84c19fc7c1d4225"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.690590 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" event={"ID":"9f6cc294-684f-4ac2-8eb1-183af364c619","Type":"ContainerStarted","Data":"f2e64ceb5a41b2db28d78f7abd922bde732d98001eab561a7aecde4718a39fe9"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.691443 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.696366 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.699531 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.699648 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.699674 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w97x\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-kube-api-access-4w97x\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: E1207 19:17:32.700147 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.2001341 +0000 UTC m=+157.779124145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700613 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700637 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700706 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30ebc473-999f-4200-a9bf-714efb9646e1-metrics-tls\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700728 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99rh\" (UniqueName: 
\"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700791 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8kkv\" (UniqueName: \"kubernetes.io/projected/d399274f-1d12-4ffb-b2b6-e2b3619c86e6-kube-api-access-k8kkv\") pod \"migrator-59844c95c7-prjml\" (UID: \"d399274f-1d12-4ffb-b2b6-e2b3619c86e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700842 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-config\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700858 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700874 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30ebc473-999f-4200-a9bf-714efb9646e1-trusted-ca\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700936 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700955 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.700974 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.701014 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.701036 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.705512 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.706627 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.715381 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" event={"ID":"c13c8a39-c00b-46e6-b721-57c74409a776","Type":"ContainerStarted","Data":"276719f46dc97add464a457c9086bbce04b6be935aaef090bd16a0fa65e93bba"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.715418 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" event={"ID":"c13c8a39-c00b-46e6-b721-57c74409a776","Type":"ContainerStarted","Data":"51fa4070544ec9545aa81d7bf53af2e7fb4b4f2c0001e3cb6a4a578adacd81a4"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.718303 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" event={"ID":"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b","Type":"ContainerStarted","Data":"135ab1325f0144e69807e5891782e84c45846c42c77254f6bc7104e190b66ca0"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.719861 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" 
event={"ID":"4dbec086-a865-4859-adbf-ab61d8395463","Type":"ContainerStarted","Data":"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.720638 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.724253 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" event={"ID":"1523d123-99b7-4f3f-a3f6-c5502562eedc","Type":"ContainerStarted","Data":"8d9992a3e0b314123caa37d98ec24b013996bb56be770c44683d1b7353b9fbb6"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.724294 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" event={"ID":"1523d123-99b7-4f3f-a3f6-c5502562eedc","Type":"ContainerStarted","Data":"80626eed8a3873a9d6ac65a062dd422dfdf1ce9a59b4c880a7ba2202d5b14fcf"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.725699 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t955b" event={"ID":"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3","Type":"ContainerStarted","Data":"7b68ae9fc38224fc367d08baadb19f6799ae3f4c68b1203a59ffc314934225d4"} Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.729626 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806164 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:32 crc 
kubenswrapper[4815]: I1207 19:17:32.806367 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f38c9c5-3002-4d5a-a880-42359a37848b-config-volume\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806386 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmq4p\" (UniqueName: \"kubernetes.io/projected/5261a18d-d284-49f4-bc7b-196d2f5f5042-kube-api-access-lmq4p\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806408 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b23efab-81af-4c7a-9462-8d4333c2ce44-proxy-tls\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806454 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806472 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-certs\") pod \"machine-config-server-qfbp2\" 
(UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806488 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7bl\" (UniqueName: \"kubernetes.io/projected/1dea4684-eec3-484b-8bc7-f524b4671833-kube-api-access-2r7bl\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: \"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806507 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cps7w\" (UniqueName: \"kubernetes.io/projected/ff9a9a52-3b06-411e-88d9-a0afcce6a632-kube-api-access-cps7w\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806523 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdlnz\" (UniqueName: \"kubernetes.io/projected/0b23efab-81af-4c7a-9462-8d4333c2ce44-kube-api-access-zdlnz\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806554 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 
07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806576 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-srv-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806592 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-apiservice-cert\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806614 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-bound-sa-token\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806639 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f38c9c5-3002-4d5a-a880-42359a37848b-metrics-tls\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806652 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zzm\" (UniqueName: \"kubernetes.io/projected/666ce368-deb6-4d44-bb99-a11029a75bf2-kube-api-access-z8zzm\") pod \"packageserver-d55dfcdfc-bls26\" (UID: 
\"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806696 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806711 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssx7j\" (UniqueName: \"kubernetes.io/projected/850d8c0c-0380-4408-8a94-1cc7cf978944-kube-api-access-ssx7j\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806732 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55faac8b-3bb2-43c9-ad70-88da5d7932c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806747 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9mp\" (UniqueName: \"kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 
19:17:32.806761 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwqsg\" (UniqueName: \"kubernetes.io/projected/2677e4e7-8b16-4e36-9361-1896eaa6c973-kube-api-access-pwqsg\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806796 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dea4684-eec3-484b-8bc7-f524b4671833-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: \"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806838 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-webhook-cert\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806868 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w97x\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-kube-api-access-4w97x\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806884 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-profile-collector-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806898 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806932 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-registration-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806959 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.806973 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850d8c0c-0380-4408-8a94-1cc7cf978944-serving-cert\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: 
I1207 19:17:32.806987 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b23efab-81af-4c7a-9462-8d4333c2ce44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807005 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807060 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-srv-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807076 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30ebc473-999f-4200-a9bf-714efb9646e1-metrics-tls\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807092 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc5wj\" (UniqueName: \"kubernetes.io/projected/0f38c9c5-3002-4d5a-a880-42359a37848b-kube-api-access-xc5wj\") pod \"dns-default-rsb58\" 
(UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807138 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-socket-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807155 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtkt\" (UniqueName: \"kubernetes.io/projected/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-kube-api-access-tdtkt\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807170 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/666ce368-deb6-4d44-bb99-a11029a75bf2-tmpfs\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807183 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-plugins-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807215 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l99rh\" (UniqueName: 
\"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807230 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807253 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55faac8b-3bb2-43c9-ad70-88da5d7932c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807280 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d8c0c-0380-4408-8a94-1cc7cf978944-config\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807359 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e7aced-7e72-4b56-837d-6894066d39e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807375 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-node-bootstrap-token\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807388 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-key\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807408 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8kkv\" (UniqueName: \"kubernetes.io/projected/d399274f-1d12-4ffb-b2b6-e2b3619c86e6-kube-api-access-k8kkv\") pod \"migrator-59844c95c7-prjml\" (UID: \"d399274f-1d12-4ffb-b2b6-e2b3619c86e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807423 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807466 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dhf5g\" (UniqueName: \"kubernetes.io/projected/6f27c552-b00b-43f1-ac31-fc592228c5ca-kube-api-access-dhf5g\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807491 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55faac8b-3bb2-43c9-ad70-88da5d7932c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807508 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-config\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807522 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807537 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30ebc473-999f-4200-a9bf-714efb9646e1-trusted-ca\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807579 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x22z\" (UniqueName: \"kubernetes.io/projected/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-kube-api-access-2x22z\") pod \"ingress-canary-sbvc7\" (UID: \"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807648 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5v82\" (UniqueName: \"kubernetes.io/projected/d3e7aced-7e72-4b56-837d-6894066d39e9-kube-api-access-p5v82\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807700 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807715 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807759 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807775 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807789 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff9a9a52-3b06-411e-88d9-a0afcce6a632-proxy-tls\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807820 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xv7p\" (UniqueName: \"kubernetes.io/projected/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-kube-api-access-2xv7p\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807836 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-mountpoint-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " 
pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807850 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-csi-data-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807887 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-cert\") pod \"ingress-canary-sbvc7\" (UID: \"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.807902 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-cabundle\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809035 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-images\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809061 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates\") pod 
\"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809109 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809138 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e7aced-7e72-4b56-837d-6894066d39e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809162 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl946\" (UniqueName: \"kubernetes.io/projected/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-kube-api-access-sl946\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.809178 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8d2t\" (UniqueName: \"kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc 
kubenswrapper[4815]: E1207 19:17:32.809276 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.309260684 +0000 UTC m=+157.888250729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.811603 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.817558 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.818675 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.819999 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.821250 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-config\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.829038 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/30ebc473-999f-4200-a9bf-714efb9646e1-trusted-ca\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.833340 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/30ebc473-999f-4200-a9bf-714efb9646e1-metrics-tls\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.835105 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls\") pod \"image-registry-697d97f7c8-k99gl\" (UID: 
\"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.838232 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.853091 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.879262 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c74lm"] Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.880361 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w97x\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-kube-api-access-4w97x\") pod \"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.881815 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.884123 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30ebc473-999f-4200-a9bf-714efb9646e1-bound-sa-token\") pod 
\"ingress-operator-5b745b69d9-ntzvg\" (UID: \"30ebc473-999f-4200-a9bf-714efb9646e1\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.899671 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.905010 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8kkv\" (UniqueName: \"kubernetes.io/projected/d399274f-1d12-4ffb-b2b6-e2b3619c86e6-kube-api-access-k8kkv\") pod \"migrator-59844c95c7-prjml\" (UID: \"d399274f-1d12-4ffb-b2b6-e2b3619c86e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.907668 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.910304 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/666ce368-deb6-4d44-bb99-a11029a75bf2-tmpfs\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.910435 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-plugins-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.910558 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.912769 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55faac8b-3bb2-43c9-ad70-88da5d7932c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.912870 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d8c0c-0380-4408-8a94-1cc7cf978944-config\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.912987 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e7aced-7e72-4b56-837d-6894066d39e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913087 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-node-bootstrap-token\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 
crc kubenswrapper[4815]: I1207 19:17:32.913180 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-key\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913279 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913377 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhf5g\" (UniqueName: \"kubernetes.io/projected/6f27c552-b00b-43f1-ac31-fc592228c5ca-kube-api-access-dhf5g\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913467 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55faac8b-3bb2-43c9-ad70-88da5d7932c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913565 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x22z\" (UniqueName: \"kubernetes.io/projected/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-kube-api-access-2x22z\") pod \"ingress-canary-sbvc7\" (UID: 
\"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913658 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5v82\" (UniqueName: \"kubernetes.io/projected/d3e7aced-7e72-4b56-837d-6894066d39e9-kube-api-access-p5v82\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.913747 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914226 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914328 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff9a9a52-3b06-411e-88d9-a0afcce6a632-proxy-tls\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914426 4815 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2xv7p\" (UniqueName: \"kubernetes.io/projected/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-kube-api-access-2xv7p\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914526 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-mountpoint-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914622 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-csi-data-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914715 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-cert\") pod \"ingress-canary-sbvc7\" (UID: \"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914801 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-cabundle\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914909 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-images\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915016 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e7aced-7e72-4b56-837d-6894066d39e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915121 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl946\" (UniqueName: \"kubernetes.io/projected/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-kube-api-access-sl946\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915226 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8d2t\" (UniqueName: \"kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f38c9c5-3002-4d5a-a880-42359a37848b-config-volume\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 
19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915412 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmq4p\" (UniqueName: \"kubernetes.io/projected/5261a18d-d284-49f4-bc7b-196d2f5f5042-kube-api-access-lmq4p\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915538 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b23efab-81af-4c7a-9462-8d4333c2ce44-proxy-tls\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915612 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/850d8c0c-0380-4408-8a94-1cc7cf978944-config\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915701 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-certs\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915788 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r7bl\" (UniqueName: \"kubernetes.io/projected/1dea4684-eec3-484b-8bc7-f524b4671833-kube-api-access-2r7bl\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: 
\"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915884 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cps7w\" (UniqueName: \"kubernetes.io/projected/ff9a9a52-3b06-411e-88d9-a0afcce6a632-kube-api-access-cps7w\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915999 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdlnz\" (UniqueName: \"kubernetes.io/projected/0b23efab-81af-4c7a-9462-8d4333c2ce44-kube-api-access-zdlnz\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916102 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-srv-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916192 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-apiservice-cert\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916309 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916411 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f38c9c5-3002-4d5a-a880-42359a37848b-metrics-tls\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916500 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zzm\" (UniqueName: \"kubernetes.io/projected/666ce368-deb6-4d44-bb99-a11029a75bf2-kube-api-access-z8zzm\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916587 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916677 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssx7j\" (UniqueName: \"kubernetes.io/projected/850d8c0c-0380-4408-8a94-1cc7cf978944-kube-api-access-ssx7j\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc 
kubenswrapper[4815]: I1207 19:17:32.916778 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916862 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55faac8b-3bb2-43c9-ad70-88da5d7932c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.916954 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9mp\" (UniqueName: \"kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917052 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwqsg\" (UniqueName: \"kubernetes.io/projected/2677e4e7-8b16-4e36-9361-1896eaa6c973-kube-api-access-pwqsg\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917162 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-webhook-cert\") pod 
\"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917272 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dea4684-eec3-484b-8bc7-f524b4671833-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: \"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917389 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-profile-collector-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917700 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3e7aced-7e72-4b56-837d-6894066d39e9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914589 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: 
I1207 19:17:32.917901 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-csi-data-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917905 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.911041 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/666ce368-deb6-4d44-bb99-a11029a75bf2-tmpfs\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.914752 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-mountpoint-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.917491 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 
19:17:32.922677 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-registration-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.922762 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850d8c0c-0380-4408-8a94-1cc7cf978944-serving-cert\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.922836 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b23efab-81af-4c7a-9462-8d4333c2ce44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.922951 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-srv-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.923033 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc5wj\" (UniqueName: \"kubernetes.io/projected/0f38c9c5-3002-4d5a-a880-42359a37848b-kube-api-access-xc5wj\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " 
pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.923127 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-socket-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.923224 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtkt\" (UniqueName: \"kubernetes.io/projected/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-kube-api-access-tdtkt\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: E1207 19:17:32.923726 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.423714445 +0000 UTC m=+158.002704490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.911308 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-plugins-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.927348 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0f38c9c5-3002-4d5a-a880-42359a37848b-config-volume\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.928507 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-cabundle\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.930193 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.915150 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55faac8b-3bb2-43c9-ad70-88da5d7932c7-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.959666 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ff9a9a52-3b06-411e-88d9-a0afcce6a632-images\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.962095 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.973116 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.978444 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b23efab-81af-4c7a-9462-8d4333c2ce44-proxy-tls\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.987224 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.987803 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-node-bootstrap-token\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.988111 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d3e7aced-7e72-4b56-837d-6894066d39e9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:32 crc kubenswrapper[4815]: I1207 19:17:32.992297 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-registration-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:32.998739 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-webhook-cert\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.005296 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-srv-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.005707 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/666ce368-deb6-4d44-bb99-a11029a75bf2-apiservice-cert\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.007080 4815 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f38c9c5-3002-4d5a-a880-42359a37848b-metrics-tls\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.008414 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.011514 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5261a18d-d284-49f4-bc7b-196d2f5f5042-profile-collector-cert\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.017849 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e93e31e5-1375-4931-b77d-dd8b94f4cd4e-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-fp4mr\" (UID: \"e93e31e5-1375-4931-b77d-dd8b94f4cd4e\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.021659 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-srv-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.033739 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/1dea4684-eec3-484b-8bc7-f524b4671833-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: \"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.035966 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/850d8c0c-0380-4408-8a94-1cc7cf978944-serving-cert\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: \"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.036038 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2677e4e7-8b16-4e36-9361-1896eaa6c973-socket-dir\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.036049 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.036595 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/6f27c552-b00b-43f1-ac31-fc592228c5ca-certs\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.036868 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b23efab-81af-4c7a-9462-8d4333c2ce44-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.037656 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.054450 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z4njj"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.054480 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.055137 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-signing-key\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.055622 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-cert\") pod \"ingress-canary-sbvc7\" (UID: \"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.068133 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:33 crc 
kubenswrapper[4815]: E1207 19:17:33.068553 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.568536705 +0000 UTC m=+158.147526750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.071056 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x22z\" (UniqueName: \"kubernetes.io/projected/8d96d1d6-7b0d-43b9-9589-83f4c76200c6-kube-api-access-2x22z\") pod \"ingress-canary-sbvc7\" (UID: \"8d96d1d6-7b0d-43b9-9589-83f4c76200c6\") " pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.095981 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99rh\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.096336 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55faac8b-3bb2-43c9-ad70-88da5d7932c7-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:33 crc kubenswrapper[4815]: W1207 19:17:33.096494 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2bbe4d3_67cf_4517_93c7_528e1694f76c.slice/crio-531ab9cae2bb91a76d1707062c2608bc4872744f8d0a89634eca2ae361b0d426 WatchSource:0}: Error finding container 531ab9cae2bb91a76d1707062c2608bc4872744f8d0a89634eca2ae361b0d426: Status 404 returned error can't find the container with id 531ab9cae2bb91a76d1707062c2608bc4872744f8d0a89634eca2ae361b0d426 Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.096849 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ff9a9a52-3b06-411e-88d9-a0afcce6a632-proxy-tls\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.096860 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/55faac8b-3bb2-43c9-ad70-88da5d7932c7-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nm8gg\" (UID: \"55faac8b-3bb2-43c9-ad70-88da5d7932c7\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.097292 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.097617 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r7bl\" (UniqueName: \"kubernetes.io/projected/1dea4684-eec3-484b-8bc7-f524b4671833-kube-api-access-2r7bl\") pod \"package-server-manager-789f6589d5-rw7gt\" (UID: \"1dea4684-eec3-484b-8bc7-f524b4671833\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.098602 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhf5g\" (UniqueName: \"kubernetes.io/projected/6f27c552-b00b-43f1-ac31-fc592228c5ca-kube-api-access-dhf5g\") pod \"machine-config-server-qfbp2\" (UID: \"6f27c552-b00b-43f1-ac31-fc592228c5ca\") " pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.117326 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5v82\" (UniqueName: \"kubernetes.io/projected/d3e7aced-7e72-4b56-837d-6894066d39e9-kube-api-access-p5v82\") pod \"kube-storage-version-migrator-operator-b67b599dd-4lfqc\" (UID: \"d3e7aced-7e72-4b56-837d-6894066d39e9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.136679 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtkt\" (UniqueName: \"kubernetes.io/projected/205b2c9d-23e8-4ce7-9ec9-86a7046a8679-kube-api-access-tdtkt\") pod \"service-ca-9c57cc56f-7pcrk\" (UID: \"205b2c9d-23e8-4ce7-9ec9-86a7046a8679\") " pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.140476 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssx7j\" (UniqueName: \"kubernetes.io/projected/850d8c0c-0380-4408-8a94-1cc7cf978944-kube-api-access-ssx7j\") pod \"service-ca-operator-777779d784-5qcv4\" (UID: 
\"850d8c0c-0380-4408-8a94-1cc7cf978944\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.150750 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8d2t\" (UniqueName: \"kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t\") pod \"marketplace-operator-79b997595-2r46v\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.156017 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmq4p\" (UniqueName: \"kubernetes.io/projected/5261a18d-d284-49f4-bc7b-196d2f5f5042-kube-api-access-lmq4p\") pod \"catalog-operator-68c6474976-qcs4x\" (UID: \"5261a18d-d284-49f4-bc7b-196d2f5f5042\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.170343 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.170661 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.670650175 +0000 UTC m=+158.249640210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.172590 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl946\" (UniqueName: \"kubernetes.io/projected/c9c9f5c1-6d1f-4cec-8365-3010eb0355a8-kube-api-access-sl946\") pod \"olm-operator-6b444d44fb-mwpns\" (UID: \"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.180443 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xv7p\" (UniqueName: \"kubernetes.io/projected/c6a12b6a-72ab-4b3a-858b-9e4657c1e03f-kube-api-access-2xv7p\") pod \"multus-admission-controller-857f4d67dd-2zzzb\" (UID: \"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.201712 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cps7w\" (UniqueName: \"kubernetes.io/projected/ff9a9a52-3b06-411e-88d9-a0afcce6a632-kube-api-access-cps7w\") pod \"machine-config-operator-74547568cd-4gqqs\" (UID: \"ff9a9a52-3b06-411e-88d9-a0afcce6a632\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.219289 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdlnz\" (UniqueName: 
\"kubernetes.io/projected/0b23efab-81af-4c7a-9462-8d4333c2ce44-kube-api-access-zdlnz\") pod \"machine-config-controller-84d6567774-lvls8\" (UID: \"0b23efab-81af-4c7a-9462-8d4333c2ce44\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.219436 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.238826 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.256118 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.266322 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zzm\" (UniqueName: \"kubernetes.io/projected/666ce368-deb6-4d44-bb99-a11029a75bf2-kube-api-access-z8zzm\") pod \"packageserver-d55dfcdfc-bls26\" (UID: \"666ce368-deb6-4d44-bb99-a11029a75bf2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.275388 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.275714 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.775698237 +0000 UTC m=+158.354688282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.278766 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.279097 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.283780 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.284128 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9mp\" (UniqueName: \"kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp\") pod \"collect-profiles-29418915-rch2v\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.292381 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.308303 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.316741 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qfbp2" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.320055 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.328350 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.330558 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc5wj\" (UniqueName: \"kubernetes.io/projected/0f38c9c5-3002-4d5a-a880-42359a37848b-kube-api-access-xc5wj\") pod \"dns-default-rsb58\" (UID: \"0f38c9c5-3002-4d5a-a880-42359a37848b\") " pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.333909 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.337436 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwqsg\" (UniqueName: \"kubernetes.io/projected/2677e4e7-8b16-4e36-9361-1896eaa6c973-kube-api-access-pwqsg\") pod \"csi-hostpathplugin-74mbw\" (UID: \"2677e4e7-8b16-4e36-9361-1896eaa6c973\") " pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.344326 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.354250 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.362117 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sbvc7" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.379632 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.379989 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.879976166 +0000 UTC m=+158.458966211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.380094 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.388736 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.392996 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.409611 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.438549 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-c2mqh"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.443160 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gpv67"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.465109 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-kznql"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.482596 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.483136 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:33.983117715 +0000 UTC m=+158.562107760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.555668 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xxlhj" podStartSLOduration=128.555647678 podStartE2EDuration="2m8.555647678s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:33.517896665 +0000 UTC m=+158.096886710" watchObservedRunningTime="2025-12-07 19:17:33.555647678 +0000 UTC m=+158.134637723" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.579213 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.591805 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-7gwm4"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.599806 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.611499 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.614050 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.11403017 +0000 UTC m=+158.693020215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.638551 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.720066 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.720251 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.220221553 +0000 UTC m=+158.799211598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.720441 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.720785 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.220769578 +0000 UTC m=+158.799759623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.735736 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" event={"ID":"d4648241-9b66-421e-b267-fc03442657a8","Type":"ContainerStarted","Data":"7216254915e3c1f9f588f06aea0d0c5303b0e16352adf8979c315f364ce133b3"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.736491 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" event={"ID":"8e21be4e-1543-4e91-b451-6b7d9f258195","Type":"ContainerStarted","Data":"bfe8729e16bce949bb290441236944047e20098d490074f3d027e720a1bbddb5"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.752274 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" event={"ID":"e56a0152-c9d6-4e9c-9a59-4ef99aab6524","Type":"ContainerStarted","Data":"5ab29d37142b3176cadc6ef46c09b6aa993e3a9f057e171c5dcf6ff08c6ffaef"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.752603 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" event={"ID":"e56a0152-c9d6-4e9c-9a59-4ef99aab6524","Type":"ContainerStarted","Data":"72ce6ff50c1b57b643c4b0105da5ec68a8ed62b36b3f9d67ae7a6a2a6cca1874"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.820838 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.821554 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.321540051 +0000 UTC m=+158.900530096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.842147 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.842182 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.842194 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-t955b" event={"ID":"e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3","Type":"ContainerStarted","Data":"fd299beb23cb28207bb5d9d270aa9c15d88c466902e4359a738d6067b9ecb2cb"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.842524 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr"] 
Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.858707 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg"] Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.865120 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" event={"ID":"a2bbe4d3-67cf-4517-93c7-528e1694f76c","Type":"ContainerStarted","Data":"531ab9cae2bb91a76d1707062c2608bc4872744f8d0a89634eca2ae361b0d426"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.874199 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" event={"ID":"00cce455-9eba-4d80-a518-6c05a8efece5","Type":"ContainerStarted","Data":"5eb41a37bd354ccd566f880daf3b143acb5cb6475cd90a9a9b8fcabed1dc6650"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.883687 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" event={"ID":"7ddadb50-73b1-4948-807b-fe26ca78ea76","Type":"ContainerStarted","Data":"9d6b29e9b9042286267f9c3b39c09a0938db4c963e9c42487c6857459aa1a0ea"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.895284 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" event={"ID":"bc66ff05-f3e1-457d-bed3-b56751586ac4","Type":"ContainerStarted","Data":"2d3c1b9737057b044d62850e37317eb71f4cfb1b73e6bd1a99abbf3fb93dc09a"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.926434 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:33 crc kubenswrapper[4815]: E1207 19:17:33.927385 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.427364514 +0000 UTC m=+159.006354649 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.970799 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" event={"ID":"4a3d0d98-5bcb-44a8-beda-1a8559f504ad","Type":"ContainerStarted","Data":"2419e99d14be49a9f3984af3a8b4b438d9662357f0dffecf940b6e78821d2fe7"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.971117 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" event={"ID":"4a3d0d98-5bcb-44a8-beda-1a8559f504ad","Type":"ContainerStarted","Data":"9663b9ad73b1295d7300288ccea8511011c4f8f2585c4193f238139ef8ab1ac9"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.981014 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" event={"ID":"25d8da8c-5fc5-42ea-8779-2394e32fadee","Type":"ContainerStarted","Data":"82cb8f7dba3432d72f7d1dded94469a0b6643658c1020a98af2353b39c286d79"} Dec 07 19:17:33 crc kubenswrapper[4815]: I1207 19:17:33.990672 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gpv67" event={"ID":"cb417697-bbf5-4de5-ae9c-c04c37623e57","Type":"ContainerStarted","Data":"9e495c0b2455f26ded3c05de21ea2a66e85761f7bd9fa04fb863a93493fbac65"} Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.022419 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" event={"ID":"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b","Type":"ContainerStarted","Data":"d13d32673da46b188a4bdea8aaccd27202aa95768518faf753ef0cb4e30370d3"} Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.022452 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" event={"ID":"d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b","Type":"ContainerStarted","Data":"61d879389730348f0bfe702d958ee435b1c39b9e972ff8b0db2d822dc25e52b4"} Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.027591 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.027746 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.527723546 +0000 UTC m=+159.106713611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.027824 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.028349 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.528333343 +0000 UTC m=+159.107323388 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.132208 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.132418 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.632387547 +0000 UTC m=+159.211377592 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.132784 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.134472 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.634459284 +0000 UTC m=+159.213449329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.233836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.233937 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.733922111 +0000 UTC m=+159.312912156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.234120 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.234387 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.734380664 +0000 UTC m=+159.313370709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.337646 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.338398 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.838366876 +0000 UTC m=+159.417356921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.439053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.439376 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:34.939364765 +0000 UTC m=+159.518354810 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.540564 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.540839 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.040822407 +0000 UTC m=+159.619812452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.572446 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-nz5vs" podStartSLOduration=129.5724279 podStartE2EDuration="2m9.5724279s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:34.480967274 +0000 UTC m=+159.059957319" watchObservedRunningTime="2025-12-07 19:17:34.5724279 +0000 UTC m=+159.151417945" Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.575577 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7pcrk"] Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.623152 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.634458 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:34 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:34 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:34 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.634516 
4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.645729 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.646054 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.146041383 +0000 UTC m=+159.725031428 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.746652 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.747244 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.247229928 +0000 UTC m=+159.826219973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.848069 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.848952 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.348909476 +0000 UTC m=+159.927899521 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.947356 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" podStartSLOduration=129.947337215 podStartE2EDuration="2m9.947337215s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:34.901680044 +0000 UTC m=+159.480670089" watchObservedRunningTime="2025-12-07 19:17:34.947337215 +0000 UTC m=+159.526327260" Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.948531 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-ks27d" podStartSLOduration=129.948526608 podStartE2EDuration="2m9.948526608s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:34.947767537 +0000 UTC m=+159.526757582" watchObservedRunningTime="2025-12-07 19:17:34.948526608 +0000 UTC m=+159.527516653" Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.949536 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:34 crc kubenswrapper[4815]: E1207 19:17:34.950201 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.450164883 +0000 UTC m=+160.029154928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.963613 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x"] Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.980630 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26"] Dec 07 19:17:34 crc kubenswrapper[4815]: I1207 19:17:34.991252 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.006831 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2zzzb"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.021991 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.063329 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.066431 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.068271 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.069105 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.569090908 +0000 UTC m=+160.148080953 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.069692 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cdxxd" podStartSLOduration=130.069682254 podStartE2EDuration="2m10.069682254s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:34.981858698 +0000 UTC m=+159.560848743" watchObservedRunningTime="2025-12-07 19:17:35.069682254 +0000 UTC m=+159.648672299" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.069746 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.093477 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-mxwsj" podStartSLOduration=129.093456351 podStartE2EDuration="2m9.093456351s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.018774468 +0000 UTC m=+159.597764513" watchObservedRunningTime="2025-12-07 19:17:35.093456351 +0000 UTC m=+159.672446396" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.104020 4815 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.109149 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rsb58"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.110214 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" podStartSLOduration=130.110197353 podStartE2EDuration="2m10.110197353s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.077598063 +0000 UTC m=+159.656588108" watchObservedRunningTime="2025-12-07 19:17:35.110197353 +0000 UTC m=+159.689187398" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.120175 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" event={"ID":"6029b56f-3736-417d-b832-46b53a99a505","Type":"ContainerStarted","Data":"b9f899146c8afb2d3c6a6ad73578b733cd4e4513485941fd97d49f3b26edea97"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.171899 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.172168 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.672141874 +0000 UTC m=+160.251131919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.173024 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.173319 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.673307756 +0000 UTC m=+160.252297801 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.180815 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.187405 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sbvc7"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.188981 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-r87km" podStartSLOduration=130.188963418 podStartE2EDuration="2m10.188963418s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.115941152 +0000 UTC m=+159.694931197" watchObservedRunningTime="2025-12-07 19:17:35.188963418 +0000 UTC m=+159.767953463" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.189748 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" podStartSLOduration=129.18974397 podStartE2EDuration="2m9.18974397s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.167473385 +0000 UTC m=+159.746463430" watchObservedRunningTime="2025-12-07 
19:17:35.18974397 +0000 UTC m=+159.768734015" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.198237 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" event={"ID":"bc66ff05-f3e1-457d-bed3-b56751586ac4","Type":"ContainerStarted","Data":"81357d90344d1d1fff04264bb39ce272e4fddcf7a796bc34dde38953a8c23b9f"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.214168 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-t955b" podStartSLOduration=130.214152894 podStartE2EDuration="2m10.214152894s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.192398623 +0000 UTC m=+159.771388658" watchObservedRunningTime="2025-12-07 19:17:35.214152894 +0000 UTC m=+159.793142939" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.214487 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-74mbw"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.223263 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" event={"ID":"205b2c9d-23e8-4ce7-9ec9-86a7046a8679","Type":"ContainerStarted","Data":"be93b49e2c3a93eeae080924bd607eb2bdd90f46baf265cff0a4695e76934975"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.223296 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" event={"ID":"205b2c9d-23e8-4ce7-9ec9-86a7046a8679","Type":"ContainerStarted","Data":"53de5b90436f4772fa17173a9f55f209d8c408b52097dd65898b5b61c840e3eb"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.240330 4815 generic.go:334] "Generic (PLEG): container finished" podID="a2bbe4d3-67cf-4517-93c7-528e1694f76c" 
containerID="6603d5b97b53f5336f6ebf82b41bc3db3de8eb85a5423dea039fc7f763ff646e" exitCode=0 Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.240444 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" event={"ID":"a2bbe4d3-67cf-4517-93c7-528e1694f76c","Type":"ContainerDied","Data":"6603d5b97b53f5336f6ebf82b41bc3db3de8eb85a5423dea039fc7f763ff646e"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.252135 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" event={"ID":"8e21be4e-1543-4e91-b451-6b7d9f258195","Type":"ContainerStarted","Data":"4f46e10e8f74342ea8664657dd24bed34ed0993d8643f45fd06b03d76b9fa6ab"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.252359 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.267134 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qpls2" podStartSLOduration=130.267117067 podStartE2EDuration="2m10.267117067s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.266708626 +0000 UTC m=+159.845698691" watchObservedRunningTime="2025-12-07 19:17:35.267117067 +0000 UTC m=+159.846107112" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.270150 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9hczz" podStartSLOduration=130.27013612 podStartE2EDuration="2m10.27013612s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.242356183 +0000 UTC m=+159.821346228" watchObservedRunningTime="2025-12-07 19:17:35.27013612 +0000 UTC m=+159.849126165" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.272649 4815 patch_prober.go:28] interesting pod/console-operator-58897d9998-c2mqh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.274642 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" podUID="8e21be4e-1543-4e91-b451-6b7d9f258195" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.273993 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.274039 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.774027478 +0000 UTC m=+160.353017523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.280821 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.279420 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" event={"ID":"d4648241-9b66-421e-b267-fc03442657a8","Type":"ContainerDied","Data":"8ef16ecade3ab9b32c9902ba0c9582b8f037d82c65883ae3413c844fcff3fa99"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.279398 4815 generic.go:334] "Generic (PLEG): container finished" podID="d4648241-9b66-421e-b267-fc03442657a8" containerID="8ef16ecade3ab9b32c9902ba0c9582b8f037d82c65883ae3413c844fcff3fa99" exitCode=0 Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.282158 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.782146692 +0000 UTC m=+160.361136737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.295532 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7pcrk" podStartSLOduration=129.295513851 podStartE2EDuration="2m9.295513851s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.295009267 +0000 UTC m=+159.873999312" watchObservedRunningTime="2025-12-07 19:17:35.295513851 +0000 UTC m=+159.874503896" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.297431 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" event={"ID":"7d44182d-206b-47ba-9355-dd21174fbea9","Type":"ContainerStarted","Data":"85e436a35c7c4e42762e60c5707c9ed05a3d423cf2ee0084f92c95ff51bcbc54"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.314502 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qfbp2" event={"ID":"6f27c552-b00b-43f1-ac31-fc592228c5ca","Type":"ContainerStarted","Data":"3a2b7e330857c22cfb9c8d4811f04eee0c4043df81a42044780d8b3532e7ee59"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.314785 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qfbp2" 
event={"ID":"6f27c552-b00b-43f1-ac31-fc592228c5ca","Type":"ContainerStarted","Data":"1cf5ed5a8256bba6d6e3e9065b6c1846ab2bf44f5871a6b5a3b342645ff4719c"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.320115 4815 generic.go:334] "Generic (PLEG): container finished" podID="25d8da8c-5fc5-42ea-8779-2394e32fadee" containerID="a3d9304fb88d494a7354222fec5d8a3f2106b85de1256cccd35e356b7be4d1ef" exitCode=0 Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.320171 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" event={"ID":"25d8da8c-5fc5-42ea-8779-2394e32fadee","Type":"ContainerDied","Data":"a3d9304fb88d494a7354222fec5d8a3f2106b85de1256cccd35e356b7be4d1ef"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.335511 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c74lm" event={"ID":"00cce455-9eba-4d80-a518-6c05a8efece5","Type":"ContainerStarted","Data":"76d5373889f3506d4ae706a43365b166ba3df66b2a0193b480867c61b532bff5"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.342948 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" event={"ID":"7ddadb50-73b1-4948-807b-fe26ca78ea76","Type":"ContainerStarted","Data":"45b75e8629fa4434278a34cbc3bdadd64a6d8bdd395da901f3ad6482c1027576"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.368654 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" podStartSLOduration=130.368641961 podStartE2EDuration="2m10.368641961s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.36716188 +0000 UTC m=+159.946151925" watchObservedRunningTime="2025-12-07 19:17:35.368641961 
+0000 UTC m=+159.947632006" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.384866 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.385005 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.884984152 +0000 UTC m=+160.463974197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.385104 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.386007 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:35.88599718 +0000 UTC m=+160.464987315 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.404614 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" event={"ID":"e93e31e5-1375-4931-b77d-dd8b94f4cd4e","Type":"ContainerStarted","Data":"aa83260741695f8c6592c961bf13b0d1c66176a808bc25a9102d0f1286e5263b"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.410081 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" event={"ID":"55faac8b-3bb2-43c9-ad70-88da5d7932c7","Type":"ContainerStarted","Data":"d98484adfb356db5293afc9968df05024b4331f930ec7d9018fcdebf610425a8"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.449164 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" event={"ID":"d399274f-1d12-4ffb-b2b6-e2b3619c86e6","Type":"ContainerStarted","Data":"f7d097fd8a54706f4964075ea5db3196ccd04c8fedad9266c9a3e47392960eba"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.449201 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" event={"ID":"d399274f-1d12-4ffb-b2b6-e2b3619c86e6","Type":"ContainerStarted","Data":"7dce44c95dd90ab4ed633581999e15a108708c9dd3c07cc5deb8e1b7790ea648"} Dec 07 19:17:35 crc 
kubenswrapper[4815]: I1207 19:17:35.454511 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" event={"ID":"30ebc473-999f-4200-a9bf-714efb9646e1","Type":"ContainerStarted","Data":"decd7d902283a25971ca6f8fa8711ab9e5bf9d65300c0ca836c4f4c802555579"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.454567 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" event={"ID":"30ebc473-999f-4200-a9bf-714efb9646e1","Type":"ContainerStarted","Data":"8b3a612fc4a4f29d28d62a5d0030c453d2d6332996fa77d4d27f5ed983082248"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.459092 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" event={"ID":"df54df8a-669c-4230-b377-640a79b757ab","Type":"ContainerStarted","Data":"cf7a7eeb0161b77fa9cbe45b5ed688b4fe92d870fc90ef417650db5b3c9887b2"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.459118 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" event={"ID":"df54df8a-669c-4230-b377-640a79b757ab","Type":"ContainerStarted","Data":"4e31e68bffd3deabc48fc1b3d681fac041f1704c95869de04ce2e847c138532e"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.460134 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.465286 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt"] Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.470457 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gpv67" 
event={"ID":"cb417697-bbf5-4de5-ae9c-c04c37623e57","Type":"ContainerStarted","Data":"ae28185864739701227ffb53f48b43f9c4639beb5cee9249baa73534db03f9d5"} Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.479060 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.486231 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qfbp2" podStartSLOduration=5.486216578 podStartE2EDuration="5.486216578s" podCreationTimestamp="2025-12-07 19:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.484671406 +0000 UTC m=+160.063661441" watchObservedRunningTime="2025-12-07 19:17:35.486216578 +0000 UTC m=+160.065206623" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.487449 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.489091 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.989068857 +0000 UTC m=+160.568058902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.489557 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.490959 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:35.990950879 +0000 UTC m=+160.569940924 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.590768 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.592502 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.092487523 +0000 UTC m=+160.671477568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.594852 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-gpv67" podStartSLOduration=130.590462358 podStartE2EDuration="2m10.590462358s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.569259682 +0000 UTC m=+160.148249727" watchObservedRunningTime="2025-12-07 19:17:35.590462358 +0000 UTC m=+160.169452403" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.597320 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" podStartSLOduration=130.597311357 podStartE2EDuration="2m10.597311357s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.595166647 +0000 UTC m=+160.174156692" watchObservedRunningTime="2025-12-07 19:17:35.597311357 +0000 UTC m=+160.176301402" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.610984 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:35 crc kubenswrapper[4815]: [-]has-synced 
failed: reason withheld Dec 07 19:17:35 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:35 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.611213 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.692820 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.693132 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.193120303 +0000 UTC m=+160.772110348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.710744 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" podStartSLOduration=130.710726719 podStartE2EDuration="2m10.710726719s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.701086363 +0000 UTC m=+160.280076418" watchObservedRunningTime="2025-12-07 19:17:35.710726719 +0000 UTC m=+160.289716764" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.803515 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.803789 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.303775969 +0000 UTC m=+160.882766014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.855464 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xnwvg" podStartSLOduration=130.855448566 podStartE2EDuration="2m10.855448566s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:35.754149138 +0000 UTC m=+160.333139183" watchObservedRunningTime="2025-12-07 19:17:35.855448566 +0000 UTC m=+160.434438621" Dec 07 19:17:35 crc kubenswrapper[4815]: I1207 19:17:35.904834 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:35 crc kubenswrapper[4815]: E1207 19:17:35.905161 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.405144799 +0000 UTC m=+160.984134834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.008813 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.009107 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.50909224 +0000 UTC m=+161.088082285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.009163 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.009419 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.509411409 +0000 UTC m=+161.088401454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.110193 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.110668 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.610655625 +0000 UTC m=+161.189645660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.211604 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.211940 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.711904131 +0000 UTC m=+161.290894176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.312469 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.312736 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.812715746 +0000 UTC m=+161.391705791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.425565 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.425849 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:36.92583793 +0000 UTC m=+161.504827975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.527666 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.528100 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.028084414 +0000 UTC m=+161.607074469 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.605459 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" event={"ID":"0b1f9a05-fdb4-42cd-8835-44ab845941ad","Type":"ContainerStarted","Data":"bb77becfe2fd380c377e5c61a4161e67d662c57941151ee64ec98d16711606a6"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.605750 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" event={"ID":"0b1f9a05-fdb4-42cd-8835-44ab845941ad","Type":"ContainerStarted","Data":"654f1e03ef13e52a96e4f3e9ef72628192d247c3d3e42eb6920e0a25ae255033"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.628927 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:36 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:36 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:36 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.628981 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:36 crc 
kubenswrapper[4815]: I1207 19:17:36.635707 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.636026 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.135992674 +0000 UTC m=+161.714982719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.638330 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" podStartSLOduration=131.638314838 podStartE2EDuration="2m11.638314838s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.636243271 +0000 UTC m=+161.215233316" watchObservedRunningTime="2025-12-07 19:17:36.638314838 +0000 UTC m=+161.217304883" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.656985 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" event={"ID":"55faac8b-3bb2-43c9-ad70-88da5d7932c7","Type":"ContainerStarted","Data":"629b3d04ae09eaa3290a62dbfdd385ea8019e9c0663170e969ca26b8e74af457"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.674831 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerStarted","Data":"aac1a4ad813ee6b3188f289419041212a420e2ddd65928e238c66f9b75ebe0cf"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.690576 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" event={"ID":"30ebc473-999f-4200-a9bf-714efb9646e1","Type":"ContainerStarted","Data":"d3559dafdf563b8b3c5a62ccece420fa6f328a6889d90edd257aec40f112d043"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.701501 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" event={"ID":"5261a18d-d284-49f4-bc7b-196d2f5f5042","Type":"ContainerStarted","Data":"74fbbf946ce64a755fc237e427a6b8dd70175d766e2abe4a11d0f2f95c586069"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.701566 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" event={"ID":"5261a18d-d284-49f4-bc7b-196d2f5f5042","Type":"ContainerStarted","Data":"1abe726c9e39f3d97ce069bf363ece55ead02a907a18c4caef944c90e9c7e2b2"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.702858 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.745065 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.745718 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nm8gg" podStartSLOduration=131.745704804 podStartE2EDuration="2m11.745704804s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.69703646 +0000 UTC m=+161.276026505" watchObservedRunningTime="2025-12-07 19:17:36.745704804 +0000 UTC m=+161.324694849" Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.746051 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.246030483 +0000 UTC m=+161.825020528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.746804 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-ntzvg" podStartSLOduration=131.746798455 podStartE2EDuration="2m11.746798455s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.744552763 +0000 UTC m=+161.323542798" watchObservedRunningTime="2025-12-07 19:17:36.746798455 +0000 UTC m=+161.325788500" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.746872 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" event={"ID":"6029b56f-3736-417d-b832-46b53a99a505","Type":"ContainerStarted","Data":"065f25fff4081a7f04944e040bcbe73a134c25e05400ce50576178460f586859"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.783028 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" event={"ID":"a2bbe4d3-67cf-4517-93c7-528e1694f76c","Type":"ContainerStarted","Data":"4bda7e64baf940c692dfca217c9996e31ed704e395edf8b5ec57867b4c6913d8"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.789115 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" 
event={"ID":"850d8c0c-0380-4408-8a94-1cc7cf978944","Type":"ContainerStarted","Data":"54928169c7a10bdf618ca2c4f4e3379bc567664d97fd4f44f35c2adfc162a9a1"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.789172 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" event={"ID":"850d8c0c-0380-4408-8a94-1cc7cf978944","Type":"ContainerStarted","Data":"013d7dc90317f70669c53a46eaafcd559d0c0266a08e05eb1fdab573c27ded80"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.791557 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" event={"ID":"e93e31e5-1375-4931-b77d-dd8b94f4cd4e","Type":"ContainerStarted","Data":"34701886899a81be8b8b6b53ff39864814cdb5310e9972cf2154866477479872"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.807072 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" event={"ID":"0b23efab-81af-4c7a-9462-8d4333c2ce44","Type":"ContainerStarted","Data":"ae91fc618d159cfa0cc9a9541fb55f457d5375c5200d0e1b82c0ed3678be02f4"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.807115 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" event={"ID":"0b23efab-81af-4c7a-9462-8d4333c2ce44","Type":"ContainerStarted","Data":"24651519806a6ad8394dbb52e5bf1cbe551ddc64535a2dff596f339fb13a1ca8"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.826475 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" event={"ID":"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8","Type":"ContainerStarted","Data":"b76162d1b8ed3c18898b5312a8cd22dd2bae3cfb8e8ddd41435c02c428c57f7f"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.826523 4815 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.826542 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" event={"ID":"c9c9f5c1-6d1f-4cec-8365-3010eb0355a8","Type":"ContainerStarted","Data":"f74909a344bcfa1e0054200494d3057d7d21128776f6e1ee9dd99d40e7e17eb9"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.828044 4815 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mwpns container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.828085 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" podUID="c9c9f5c1-6d1f-4cec-8365-3010eb0355a8" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.844362 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" event={"ID":"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f","Type":"ContainerStarted","Data":"d8f79d538ac863821522133d25d156a68cb50907f0c414187b8f0cf3c62f1b18"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.846091 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 
19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.846332 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.346321413 +0000 UTC m=+161.925311458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.883481 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" event={"ID":"666ce368-deb6-4d44-bb99-a11029a75bf2","Type":"ContainerStarted","Data":"6f77e5da223f811e618d3406c7674ced510e7be7368c3d7dfffd2c25fcb6e147"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.897903 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" event={"ID":"7d44182d-206b-47ba-9355-dd21174fbea9","Type":"ContainerStarted","Data":"aadda910587a27c07f5c746b536a4540a6ac3013d1452e1ba2a6558682449fe2"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.900364 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsb58" event={"ID":"0f38c9c5-3002-4d5a-a880-42359a37848b","Type":"ContainerStarted","Data":"7c018c9d8708c4aa6e96da18a0378ff8a3b0c1960286ab561c1c3d271778ed7e"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.915645 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-prjml" event={"ID":"d399274f-1d12-4ffb-b2b6-e2b3619c86e6","Type":"ContainerStarted","Data":"3685e8b05edb927c97adc0085fcb497391eea8beddeb5c599038ab65761e31f3"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.930506 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" podStartSLOduration=130.930489267 podStartE2EDuration="2m10.930489267s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.865520724 +0000 UTC m=+161.444510769" watchObservedRunningTime="2025-12-07 19:17:36.930489267 +0000 UTC m=+161.509479312" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.948575 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:36 crc kubenswrapper[4815]: E1207 19:17:36.950084 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.450065178 +0000 UTC m=+162.029055223 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.955268 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" event={"ID":"ff9a9a52-3b06-411e-88d9-a0afcce6a632","Type":"ContainerStarted","Data":"9eac2acb6e6bd7beb31f4751fe6bd22a36179e6c924a4443cea9bdec4d08df58"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.955312 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" event={"ID":"ff9a9a52-3b06-411e-88d9-a0afcce6a632","Type":"ContainerStarted","Data":"75095a02b294aaf36c312e95cfe6e394b7f1dd6c997e05c5389e4db2f29378d0"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.969193 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qcs4x" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.985592 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-fp4mr" podStartSLOduration=131.985571258 podStartE2EDuration="2m11.985571258s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.983298555 +0000 UTC m=+161.562288600" watchObservedRunningTime="2025-12-07 19:17:36.985571258 +0000 UTC m=+161.564561303" Dec 07 19:17:36 crc 
kubenswrapper[4815]: I1207 19:17:36.985751 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" podStartSLOduration=130.985747453 podStartE2EDuration="2m10.985747453s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:36.929959222 +0000 UTC m=+161.508949267" watchObservedRunningTime="2025-12-07 19:17:36.985747453 +0000 UTC m=+161.564737498" Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.992250 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" event={"ID":"d3e7aced-7e72-4b56-837d-6894066d39e9","Type":"ContainerStarted","Data":"f49e55692228e0c139e55647eaa645a01dfab7dee58483ab383e288c61374056"} Dec 07 19:17:36 crc kubenswrapper[4815]: I1207 19:17:36.992298 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" event={"ID":"d3e7aced-7e72-4b56-837d-6894066d39e9","Type":"ContainerStarted","Data":"1cf018d0caeb19d56fab636b9e44838a9c8bb9f88a2aac323c1c89fa51ec0ad4"} Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.018775 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" event={"ID":"2677e4e7-8b16-4e36-9361-1896eaa6c973","Type":"ContainerStarted","Data":"086f39551344dd491ddc68ec2bc75c943e1a4282025dc42ea42fc3b7402cb969"} Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.043203 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sbvc7" event={"ID":"8d96d1d6-7b0d-43b9-9589-83f4c76200c6","Type":"ContainerStarted","Data":"f14e10a3e7e01c9f46c8d5e9d28b3a66d3f47e266a22d0b454da81324964ce6b"} Dec 
07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.062106 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.062544 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.562524474 +0000 UTC m=+162.141514599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.085558 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" event={"ID":"1dea4684-eec3-484b-8bc7-f524b4671833","Type":"ContainerStarted","Data":"30bcac9306641a03e834fccf05a89c4d12fa6f7e5454adefc7f99f3ce0f32a36"} Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.087016 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.115137 4815 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpv67 container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.115198 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpv67" podUID="cb417697-bbf5-4de5-ae9c-c04c37623e57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.146298 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" podStartSLOduration=131.146282947 podStartE2EDuration="2m11.146282947s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.145693881 +0000 UTC m=+161.724683926" watchObservedRunningTime="2025-12-07 19:17:37.146282947 +0000 UTC m=+161.725272992" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.148343 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-5qcv4" podStartSLOduration=131.148335614 podStartE2EDuration="2m11.148335614s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.063550212 +0000 UTC m=+161.642540257" watchObservedRunningTime="2025-12-07 19:17:37.148335614 +0000 UTC m=+161.727325659" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.164401 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.165631 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.665616891 +0000 UTC m=+162.244606936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.227439 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" podStartSLOduration=131.227422798 podStartE2EDuration="2m11.227422798s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.193493181 +0000 UTC m=+161.772483226" watchObservedRunningTime="2025-12-07 19:17:37.227422798 +0000 UTC m=+161.806412843" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.278652 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: 
\"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.281769 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.781758219 +0000 UTC m=+162.360748264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.286020 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4lfqc" podStartSLOduration=132.286007906 podStartE2EDuration="2m12.286007906s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.228965401 +0000 UTC m=+161.807955456" watchObservedRunningTime="2025-12-07 19:17:37.286007906 +0000 UTC m=+161.864997951" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.379652 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:37 crc kubenswrapper[4815]: 
E1207 19:17:37.379937 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.87991064 +0000 UTC m=+162.458900685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.402330 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sbvc7" podStartSLOduration=7.402317838 podStartE2EDuration="7.402317838s" podCreationTimestamp="2025-12-07 19:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.291786186 +0000 UTC m=+161.870776231" watchObservedRunningTime="2025-12-07 19:17:37.402317838 +0000 UTC m=+161.981307883" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.430303 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.430645 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.480333 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.480883 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:37.980871178 +0000 UTC m=+162.559861223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.584377 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.584650 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.084634814 +0000 UTC m=+162.663624859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.601041 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:37 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:37 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:37 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.601092 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.691590 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.691877 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:38.191866155 +0000 UTC m=+162.770856200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.793024 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.793598 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.293583325 +0000 UTC m=+162.872573370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.894694 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.895086 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.395070838 +0000 UTC m=+162.974060883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:37 crc kubenswrapper[4815]: I1207 19:17:37.995353 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:37 crc kubenswrapper[4815]: E1207 19:17:37.995816 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.49579833 +0000 UTC m=+163.074788375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.085032 4815 patch_prober.go:28] interesting pod/console-operator-58897d9998-c2mqh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.085094 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" podUID="8e21be4e-1543-4e91-b451-6b7d9f258195" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.089561 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" event={"ID":"2677e4e7-8b16-4e36-9361-1896eaa6c973","Type":"ContainerStarted","Data":"a0defb558c41d3770021ca1de985d1f2164f5205007b8c78dfe1e4bb2bfebfc7"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.091241 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-lvls8" event={"ID":"0b23efab-81af-4c7a-9462-8d4333c2ce44","Type":"ContainerStarted","Data":"adc0fe999661c93ad7757bc0ea21b8a99f893242ac48a01b4ab54f275fcb49c3"} Dec 07 19:17:38 crc 
kubenswrapper[4815]: I1207 19:17:38.095609 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" event={"ID":"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f","Type":"ContainerStarted","Data":"c742154e1f80a96874b7e54bd1fbfa459b8b1d986862503b0e6da51c301621f6"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.095642 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" event={"ID":"c6a12b6a-72ab-4b3a-858b-9e4657c1e03f","Type":"ContainerStarted","Data":"777844851b96f7f0c261b88ac90e0591643b2575e5caa5139bc0b91763e3211c"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.096597 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.096868 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.596858581 +0000 UTC m=+163.175848626 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.102763 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" event={"ID":"6029b56f-3736-417d-b832-46b53a99a505","Type":"ContainerStarted","Data":"e762e100605fbe62fb5223c3b7aa0ecd54182d0d391fe829f11ed89433d858aa"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.106905 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" event={"ID":"d4648241-9b66-421e-b267-fc03442657a8","Type":"ContainerStarted","Data":"b79afd01dab3edbe8edeca79f1445a70cfa9c0c49981d471ee6ac6af1cfac0bb"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.106945 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" event={"ID":"d4648241-9b66-421e-b267-fc03442657a8","Type":"ContainerStarted","Data":"fd1cdd38dff8c0114e0a095a0c92929176d88e520c463095c7f5b206b60cc071"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.110549 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sbvc7" event={"ID":"8d96d1d6-7b0d-43b9-9589-83f4c76200c6","Type":"ContainerStarted","Data":"c8a86ea7ad9ee7bfc12844096e49dcca2f063e172ea36778105b9c30323ed61e"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.116472 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" 
event={"ID":"1dea4684-eec3-484b-8bc7-f524b4671833","Type":"ContainerStarted","Data":"a0ed9fcc087e7982cb02d36c2d821431f17e847fe381fb7b2fef66801b1ac699"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.116503 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" event={"ID":"1dea4684-eec3-484b-8bc7-f524b4671833","Type":"ContainerStarted","Data":"8ff69994e51f695bc9304492eec30a19cf96d27f35e634ce74337d53597c38c0"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.116948 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.119335 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerStarted","Data":"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.119976 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.121567 4815 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2r46v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.121608 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection 
refused" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.124789 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" event={"ID":"ff9a9a52-3b06-411e-88d9-a0afcce6a632","Type":"ContainerStarted","Data":"4a5499ea58269e3e73b1b2576c83d0a84a0334d4cf4301936f56af6ccb2baa64"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.126378 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" event={"ID":"666ce368-deb6-4d44-bb99-a11029a75bf2","Type":"ContainerStarted","Data":"b3ffbd4ad4ddf26edec1db4136476507269e2675c03b712fcd941c410ffdb09b"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.127045 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.128221 4815 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bls26 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body= Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.128253 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" podUID="666ce368-deb6-4d44-bb99-a11029a75bf2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.133496 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" 
event={"ID":"25d8da8c-5fc5-42ea-8779-2394e32fadee","Type":"ContainerStarted","Data":"e4472efa02c2e80e04b5da86634ce56e45d69c90b80ea266d8e42c231355c127"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.133653 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.134901 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsb58" event={"ID":"0f38c9c5-3002-4d5a-a880-42359a37848b","Type":"ContainerStarted","Data":"521be4f64a103a2918153b93650b66ca14bc33d2dd66033575799e339edbe3c0"} Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.135786 4815 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpv67 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.135833 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpv67" podUID="cb417697-bbf5-4de5-ae9c-c04c37623e57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.194706 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mwpns" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.197248 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:38 crc 
kubenswrapper[4815]: E1207 19:17:38.198206 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.69819153 +0000 UTC m=+163.277181575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.207302 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-cxk4m" podStartSLOduration=133.207283231 podStartE2EDuration="2m13.207283231s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:37.545438951 +0000 UTC m=+162.124429016" watchObservedRunningTime="2025-12-07 19:17:38.207283231 +0000 UTC m=+162.786273266" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.298959 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.302887 4815 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.802871431 +0000 UTC m=+163.381861606 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.392210 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2zzzb" podStartSLOduration=132.392193018 podStartE2EDuration="2m12.392193018s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.233367951 +0000 UTC m=+162.812357996" watchObservedRunningTime="2025-12-07 19:17:38.392193018 +0000 UTC m=+162.971183063" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.400492 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.400798 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:38.900782735 +0000 UTC m=+163.479772780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.474764 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" podStartSLOduration=132.474749868 podStartE2EDuration="2m12.474749868s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.472488256 +0000 UTC m=+163.051478291" watchObservedRunningTime="2025-12-07 19:17:38.474749868 +0000 UTC m=+163.053739913" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.501882 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.502217 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:39.002201026 +0000 UTC m=+163.581191071 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.536495 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" podStartSLOduration=132.536479083 podStartE2EDuration="2m12.536479083s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.530867058 +0000 UTC m=+163.109857103" watchObservedRunningTime="2025-12-07 19:17:38.536479083 +0000 UTC m=+163.115469128" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.602977 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.603194 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.103159155 +0000 UTC m=+163.682149200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.603568 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.603901 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.103888185 +0000 UTC m=+163.682878220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.607470 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:38 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:38 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:38 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.607522 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.608284 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.615961 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" podStartSLOduration=132.615942448 podStartE2EDuration="2m12.615942448s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.594518616 +0000 UTC m=+163.173508661" 
watchObservedRunningTime="2025-12-07 19:17:38.615942448 +0000 UTC m=+163.194932493" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.653692 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4gqqs" podStartSLOduration=132.65367563 podStartE2EDuration="2m12.65367563s" podCreationTimestamp="2025-12-07 19:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.652749544 +0000 UTC m=+163.231739589" watchObservedRunningTime="2025-12-07 19:17:38.65367563 +0000 UTC m=+163.232665675" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.704782 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.705431 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.205415899 +0000 UTC m=+163.784405944 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.771790 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" podStartSLOduration=133.771773132 podStartE2EDuration="2m13.771773132s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.730588314 +0000 UTC m=+163.309578369" watchObservedRunningTime="2025-12-07 19:17:38.771773132 +0000 UTC m=+163.350763187" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.773468 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" podStartSLOduration=133.773460898 podStartE2EDuration="2m13.773460898s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.769875249 +0000 UTC m=+163.348865284" watchObservedRunningTime="2025-12-07 19:17:38.773460898 +0000 UTC m=+163.352450943" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.806846 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: 
\"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.807314 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.307298653 +0000 UTC m=+163.886288698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.878373 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-7gwm4" podStartSLOduration=133.878355825 podStartE2EDuration="2m13.878355825s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:38.820299352 +0000 UTC m=+163.399289407" watchObservedRunningTime="2025-12-07 19:17:38.878355825 +0000 UTC m=+163.457345870" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.880365 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.881222 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.894769 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.897235 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.911009 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.911110 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.41109469 +0000 UTC m=+163.990084735 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:38 crc kubenswrapper[4815]: I1207 19:17:38.911305 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:38 crc kubenswrapper[4815]: E1207 19:17:38.911591 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.411584283 +0000 UTC m=+163.990574328 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.012025 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.012239 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.512214032 +0000 UTC m=+164.091204077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.012326 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.012379 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.012429 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.012462 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgpf\" (UniqueName: 
\"kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.012726 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.512716576 +0000 UTC m=+164.091706621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.041490 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.042400 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.045239 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.071279 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.113705 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.113880 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.113970 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.113993 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgpf\" (UniqueName: \"kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " 
pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.114247 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.61423419 +0000 UTC m=+164.193224225 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.114576 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.114782 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.164542 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" event={"ID":"2677e4e7-8b16-4e36-9361-1896eaa6c973","Type":"ContainerStarted","Data":"a6db601669ced7343ea4c9f46dabd1b2452759abe080d491aba09ee64aa0ff99"} Dec 07 
19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.165698 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgpf\" (UniqueName: \"kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf\") pod \"community-operators-nb88w\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.185096 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rsb58" event={"ID":"0f38c9c5-3002-4d5a-a880-42359a37848b","Type":"ContainerStarted","Data":"ae029a3336d66c674d4bcecbc8436d37a5bb8fdf52c5d4dfe0f3b3d64299db9e"} Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.185838 4815 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2r46v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.185882 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.187059 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.194815 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k72f8" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.218401 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.218443 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48h9v\" (UniqueName: \"kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.218477 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.218502 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.218761 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.718751867 +0000 UTC m=+164.297741912 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.223368 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.287794 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.288668 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.296327 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rsb58" podStartSLOduration=9.296316659 podStartE2EDuration="9.296316659s" podCreationTimestamp="2025-12-07 19:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:39.294338054 +0000 UTC m=+163.873328099" watchObservedRunningTime="2025-12-07 19:17:39.296316659 +0000 UTC m=+163.875306694" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.319727 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc 
kubenswrapper[4815]: E1207 19:17:39.319858 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.819833979 +0000 UTC m=+164.398824024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.320891 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48h9v\" (UniqueName: \"kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.321131 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.321259 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: 
\"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.321553 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.330970 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.331440 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.331973 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.831952093 +0000 UTC m=+164.410942138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.353324 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.415782 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48h9v\" (UniqueName: \"kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v\") pod \"certified-operators-52nff\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.424118 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.424282 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.424323 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qmwpl\" (UniqueName: \"kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.424379 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.424594 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:39.924467778 +0000 UTC m=+164.503457823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.500420 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.501520 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.525444 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.525509 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.525552 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.525589 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmwpl\" (UniqueName: \"kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.526024 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:40.026006233 +0000 UTC m=+164.604996278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.526160 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.526405 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.541551 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.600139 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:39 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:39 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:39 crc 
kubenswrapper[4815]: healthz check failed Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.600228 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.622574 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmwpl\" (UniqueName: \"kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl\") pod \"community-operators-csjhg\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.626361 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.626593 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.626616 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4x9n\" (UniqueName: \"kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 
crc kubenswrapper[4815]: I1207 19:17:39.626686 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.626781 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.126766846 +0000 UTC m=+164.705756891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.699172 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.729671 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.729749 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.729776 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4x9n\" (UniqueName: \"kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.729797 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.730102 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:40.23008937 +0000 UTC m=+164.809079415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.730633 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.730899 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.820386 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4x9n\" (UniqueName: \"kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n\") pod \"certified-operators-bjqjh\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.836382 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.836783 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.336768006 +0000 UTC m=+164.915758051 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.906202 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.923265 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:17:39 crc kubenswrapper[4815]: I1207 19:17:39.939135 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:39 crc kubenswrapper[4815]: E1207 19:17:39.939428 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.439417301 +0000 UTC m=+165.018407346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.040050 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.040411 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.54039603 +0000 UTC m=+165.119386075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.115355 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.141602 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.141899 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.641888473 +0000 UTC m=+165.220878508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.186869 4815 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-kznql container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.187156 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" 
podUID="25d8da8c-5fc5-42ea-8779-2394e32fadee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.187434 4815 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bls26 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.187451 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" podUID="666ce368-deb6-4d44-bb99-a11029a75bf2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.198062 4815 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2r46v container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.198093 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.242983 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.243369 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.743354896 +0000 UTC m=+165.322344931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.343106 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.344956 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.347930 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:40.847898253 +0000 UTC m=+165.426888308 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.446311 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.446633 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:40.946619039 +0000 UTC m=+165.525609084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.484987 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.548042 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.548373 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.048361728 +0000 UTC m=+165.627351773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.605118 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:40 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:40 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:40 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.605224 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.651380 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.651792 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-07 19:17:41.151776525 +0000 UTC m=+165.730766570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.752735 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.753141 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.253124904 +0000 UTC m=+165.832114949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.866426 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.866727 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.366710921 +0000 UTC m=+165.945700966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.911882 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.912879 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.953642 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.968724 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:40 crc kubenswrapper[4815]: E1207 19:17:40.980041 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.480023261 +0000 UTC m=+166.059013306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:40 crc kubenswrapper[4815]: I1207 19:17:40.993715 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.100508 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.100667 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.100692 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.100754 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-jtfrt\" (UniqueName: \"kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.100853 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.600837967 +0000 UTC m=+166.179828012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.106020 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.106059 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.124121 4815 patch_prober.go:28] interesting pod/console-f9d7485db-xxlhj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.124191 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xxlhj" 
podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.124135 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.199190 4815 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bls26 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.199242 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" podUID="666ce368-deb6-4d44-bb99-a11029a75bf2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.202006 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.202096 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtfrt\" (UniqueName: \"kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " 
pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.202170 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.202190 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.202533 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.203237 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.703221575 +0000 UTC m=+166.282211620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.204731 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.254213 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" event={"ID":"2677e4e7-8b16-4e36-9361-1896eaa6c973","Type":"ContainerStarted","Data":"f82d38a95aa76252f148b0268141e400cceb584d56382d0f6097d2358755db71"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.259938 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.261322 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.268107 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.270053 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerStarted","Data":"1d2f92f00630a1c5db7a72b94ed0b537a6ec7ca0eea838756639e968dad2dd29"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.270093 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerStarted","Data":"3adad301e389def75136f598732276aede87e3eefe4ff7a0437785bbc0cb0894"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.290008 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtfrt\" (UniqueName: \"kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt\") pod \"redhat-marketplace-sbt7w\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.303900 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.304107 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" 
event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerStarted","Data":"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.304153 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.304178 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.304157 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerStarted","Data":"05c3f5609de0256250c7d92e590cf20b8921e387c66f9b2185c74576dd3cfb82"} Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.304404 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.804384809 +0000 UTC m=+166.383374854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.304622 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt8qc\" (UniqueName: \"kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.307692 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.323636 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerStarted","Data":"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.323678 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerStarted","Data":"58e4c181e8567b82c04d29f5203d79cd5f6b34f9ec242e1c5ded64d82435c9b7"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.333822 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" 
event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerStarted","Data":"c23a3434eff2074ea14c7992788964839fc73b4e20b61ccb77f1951512ff169a"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.350679 4815 generic.go:334] "Generic (PLEG): container finished" podID="0b1f9a05-fdb4-42cd-8835-44ab845941ad" containerID="bb77becfe2fd380c377e5c61a4161e67d662c57941151ee64ec98d16711606a6" exitCode=0 Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.351292 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" event={"ID":"0b1f9a05-fdb4-42cd-8835-44ab845941ad","Type":"ContainerDied","Data":"bb77becfe2fd380c377e5c61a4161e67d662c57941151ee64ec98d16711606a6"} Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.412587 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.412821 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt8qc\" (UniqueName: \"kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.412955 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " 
pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.412981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.413935 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:41.913905774 +0000 UTC m=+166.492895819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.414624 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.414834 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " 
pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.458093 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt8qc\" (UniqueName: \"kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc\") pod \"redhat-marketplace-8bsfd\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.515350 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.515722 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.015707716 +0000 UTC m=+166.594697761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.528177 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.600989 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:41 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:41 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:41 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.601041 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.616228 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.616510 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.116498659 +0000 UTC m=+166.695488704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.651607 4815 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.658580 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.696833 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-kznql" Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.717657 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.717820 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.217794147 +0000 UTC m=+166.796784192 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.717926 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.718283 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.2182685 +0000 UTC m=+166.797258545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.818892 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.819047 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.319021333 +0000 UTC m=+166.898011378 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.819273 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.819519 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.319508086 +0000 UTC m=+166.898498131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.919986 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.920339 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.42031508 +0000 UTC m=+166.999305125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:41 crc kubenswrapper[4815]: I1207 19:17:41.920551 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:41 crc kubenswrapper[4815]: E1207 19:17:41.920830 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.420822594 +0000 UTC m=+166.999812629 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.021936 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:42 crc kubenswrapper[4815]: E1207 19:17:42.022223 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.522209185 +0000 UTC m=+167.101199230 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.088164 4815 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-07T19:17:41.65163714Z","Handler":null,"Name":""} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.132209 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:42 crc kubenswrapper[4815]: E1207 19:17:42.132576 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.632558672 +0000 UTC m=+167.211548717 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k99gl" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.202365 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.233363 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:42 crc kubenswrapper[4815]: E1207 19:17:42.233702 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-07 19:17:42.733687666 +0000 UTC m=+167.312677701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.240539 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.241451 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.243372 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.259512 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.270011 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.273341 4815 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.273372 4815 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.334323 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.334369 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.334850 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvjz\" (UniqueName: \"kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.335102 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.344168 4815 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.344428 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.374383 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k99gl\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.385322 4815 generic.go:334] "Generic (PLEG): container finished" podID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerID="da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f" exitCode=0 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.385369 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerDied","Data":"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.398080 4815 generic.go:334] "Generic (PLEG): container finished" podID="b9de4e38-617b-41a4-b97f-155d559d497a" containerID="1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5" exitCode=0 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.398127 4815 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerDied","Data":"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.402666 4815 generic.go:334] "Generic (PLEG): container finished" podID="cc15074d-3de2-4533-84f9-d400e3400019" containerID="46387a3d812b105dfba81410ab6b66d43d087afb667def6e51cd33345f719b04" exitCode=0 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.402777 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerDied","Data":"46387a3d812b105dfba81410ab6b66d43d087afb667def6e51cd33345f719b04"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.420328 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" event={"ID":"2677e4e7-8b16-4e36-9361-1896eaa6c973","Type":"ContainerStarted","Data":"c4b3c3c6b216ae982a03cd7b4b6073c44fe177eb4ac17072e1392d9907fbb6fc"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.421521 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerStarted","Data":"2bcd036bcecafcb1b6c576ce945b9afdf691bc050ebd3b750f071a2073b52c98"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.427943 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerStarted","Data":"5ed58502660fd6cac4ab2e6fccdc65e135de216859e48c19dec62390e0918bea"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.432736 4815 generic.go:334] "Generic (PLEG): container finished" podID="0eff7fdb-d25d-4a77-b828-1901c31f091d" 
containerID="1d2f92f00630a1c5db7a72b94ed0b537a6ec7ca0eea838756639e968dad2dd29" exitCode=0 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.433357 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerDied","Data":"1d2f92f00630a1c5db7a72b94ed0b537a6ec7ca0eea838756639e968dad2dd29"} Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.442971 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.443496 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.443565 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvjz\" (UniqueName: \"kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.443633 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc 
kubenswrapper[4815]: I1207 19:17:42.444011 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.444815 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.445609 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-74mbw" podStartSLOduration=12.445589378 podStartE2EDuration="12.445589378s" podCreationTimestamp="2025-12-07 19:17:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:42.441845475 +0000 UTC m=+167.020835520" watchObservedRunningTime="2025-12-07 19:17:42.445589378 +0000 UTC m=+167.024579423" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.457264 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.463261 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.463668 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.463738 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.472424 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvjz\" (UniqueName: \"kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz\") pod \"redhat-operators-ssmr7\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.475153 4815 patch_prober.go:28] interesting pod/apiserver-76f77b778f-z4njj container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]log ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]etcd ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/generic-apiserver-start-informers ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/max-in-flight-filter ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 07 19:17:42 crc kubenswrapper[4815]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 07 19:17:42 crc kubenswrapper[4815]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 07 
19:17:42 crc kubenswrapper[4815]: [+]poststarthook/project.openshift.io-projectcache ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/openshift.io-startinformers ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 07 19:17:42 crc kubenswrapper[4815]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 07 19:17:42 crc kubenswrapper[4815]: livez check failed Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.475191 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" podUID="d4648241-9b66-421e-b267-fc03442657a8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.595138 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.601010 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:42 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:42 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:42 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.601051 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.606217 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.647511 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.658038 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.658170 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.668981 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-c2mqh" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.706208 4815 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpv67 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.706514 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gpv67" podUID="cb417697-bbf5-4de5-ae9c-c04c37623e57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.706271 4815 patch_prober.go:28] interesting pod/downloads-7954f5f757-gpv67 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.706825 4815 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-gpv67" podUID="cb417697-bbf5-4de5-ae9c-c04c37623e57" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.729494 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.751435 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.751528 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.751600 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhtr\" (UniqueName: \"kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.765130 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:17:42 crc kubenswrapper[4815]: W1207 19:17:42.778738 4815 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3b59c8_1311_4f2a_8ec6_699405d3a4ac.slice/crio-61994bdb28b11b350eeebba08915901e777c8cbb60334b6a0fef728a01742e97 WatchSource:0}: Error finding container 61994bdb28b11b350eeebba08915901e777c8cbb60334b6a0fef728a01742e97: Status 404 returned error can't find the container with id 61994bdb28b11b350eeebba08915901e777c8cbb60334b6a0fef728a01742e97 Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856450 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume\") pod \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856493 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume\") pod \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856524 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp9mp\" (UniqueName: \"kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp\") pod \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\" (UID: \"0b1f9a05-fdb4-42cd-8835-44ab845941ad\") " Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856772 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856819 4815 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6qhtr\" (UniqueName: \"kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.856845 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.857319 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.857596 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b1f9a05-fdb4-42cd-8835-44ab845941ad" (UID: "0b1f9a05-fdb4-42cd-8835-44ab845941ad"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.857627 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.864043 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp" (OuterVolumeSpecName: "kube-api-access-wp9mp") pod "0b1f9a05-fdb4-42cd-8835-44ab845941ad" (UID: "0b1f9a05-fdb4-42cd-8835-44ab845941ad"). InnerVolumeSpecName "kube-api-access-wp9mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.869345 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b1f9a05-fdb4-42cd-8835-44ab845941ad" (UID: "0b1f9a05-fdb4-42cd-8835-44ab845941ad"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.883801 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhtr\" (UniqueName: \"kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr\") pod \"redhat-operators-7c6vz\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.957637 4815 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1f9a05-fdb4-42cd-8835-44ab845941ad-config-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.957658 4815 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1f9a05-fdb4-42cd-8835-44ab845941ad-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.957668 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp9mp\" (UniqueName: \"kubernetes.io/projected/0b1f9a05-fdb4-42cd-8835-44ab845941ad-kube-api-access-wp9mp\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.982459 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:17:42 crc kubenswrapper[4815]: I1207 19:17:42.984193 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.331382 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bls26" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.379041 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.495312 4815 generic.go:334] "Generic (PLEG): container finished" podID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerID="90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0" exitCode=0 Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.495724 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerDied","Data":"90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.509344 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" event={"ID":"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac","Type":"ContainerStarted","Data":"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.509383 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" event={"ID":"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac","Type":"ContainerStarted","Data":"61994bdb28b11b350eeebba08915901e777c8cbb60334b6a0fef728a01742e97"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.510442 4815 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.527406 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" podStartSLOduration=138.527393057 podStartE2EDuration="2m18.527393057s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:17:43.524496416 +0000 UTC m=+168.103486461" watchObservedRunningTime="2025-12-07 19:17:43.527393057 +0000 UTC m=+168.106383102" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.585309 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" event={"ID":"0b1f9a05-fdb4-42cd-8835-44ab845941ad","Type":"ContainerDied","Data":"654f1e03ef13e52a96e4f3e9ef72628192d247c3d3e42eb6920e0a25ae255033"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.585365 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="654f1e03ef13e52a96e4f3e9ef72628192d247c3d3e42eb6920e0a25ae255033" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.585447 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.598456 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:43 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:43 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:43 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.598884 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.643044 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.648681 4815 generic.go:334] "Generic (PLEG): container finished" podID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerID="565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028" exitCode=0 Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.648773 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerDied","Data":"565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.663445 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerStarted","Data":"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a"} 
Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.663474 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerStarted","Data":"61a57393d896752786f03c0b482a2a62fb13d6d9163a360a2a3f1320e99f7332"} Dec 07 19:17:43 crc kubenswrapper[4815]: I1207 19:17:43.846559 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.597746 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:44 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:44 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:44 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.597797 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.694769 4815 generic.go:334] "Generic (PLEG): container finished" podID="6da4a462-3799-4630-80bf-91b5b8112d23" containerID="3730e3c0ed5cca0e8ef7227fe50756bf34d246608f59ab8d59fb377200b3a43d" exitCode=0 Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.694842 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerDied","Data":"3730e3c0ed5cca0e8ef7227fe50756bf34d246608f59ab8d59fb377200b3a43d"} Dec 07 19:17:44 crc 
kubenswrapper[4815]: I1207 19:17:44.694866 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerStarted","Data":"c5bd3b68defec579113115dbc009a4ce4a1d514c803e7d096b477c8163e0c4a1"} Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.722761 4815 generic.go:334] "Generic (PLEG): container finished" podID="34c69927-3b4e-4e18-8201-27eb981bad10" containerID="27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a" exitCode=0 Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.722852 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerDied","Data":"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a"} Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.767591 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 07 19:17:44 crc kubenswrapper[4815]: E1207 19:17:44.767786 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1f9a05-fdb4-42cd-8835-44ab845941ad" containerName="collect-profiles" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.767796 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1f9a05-fdb4-42cd-8835-44ab845941ad" containerName="collect-profiles" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.767892 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1f9a05-fdb4-42cd-8835-44ab845941ad" containerName="collect-profiles" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.768253 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.770603 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.771841 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.786179 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.906007 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:44 crc kubenswrapper[4815]: I1207 19:17:44.906058 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.006805 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.006859 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.006955 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.027779 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.095652 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.366565 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 07 19:17:45 crc kubenswrapper[4815]: W1207 19:17:45.464593 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb0a6811b_03ce_4007_9872_6b041a4c88f1.slice/crio-3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013 WatchSource:0}: Error finding container 3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013: Status 404 returned error can't find the container with id 3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013 Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.599628 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:45 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:45 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:45 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.599679 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.611667 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.614026 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.616903 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.619852 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.629765 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.715092 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.715140 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.730950 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0a6811b-03ce-4007-9872-6b041a4c88f1","Type":"ContainerStarted","Data":"3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013"} Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.817017 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.817057 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.817615 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.838712 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:45 crc kubenswrapper[4815]: I1207 19:17:45.943035 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:46 crc kubenswrapper[4815]: I1207 19:17:46.310444 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 07 19:17:46 crc kubenswrapper[4815]: I1207 19:17:46.597924 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:46 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:46 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:46 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:46 crc kubenswrapper[4815]: I1207 19:17:46.597975 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:46 crc kubenswrapper[4815]: I1207 19:17:46.751957 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c32f494-0d8c-47fd-9e99-b2473904a4da","Type":"ContainerStarted","Data":"6303cd0fdca6187e5651ac409fc8982da059e0e0489dbdd707a7692ab5077af5"} Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.469340 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.475720 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z4njj" Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.598579 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup 
probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:47 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:47 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:47 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.598627 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.802607 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c32f494-0d8c-47fd-9e99-b2473904a4da","Type":"ContainerStarted","Data":"7add81c6b791a40b4862f5cf4211c15b1460ce517a8930075ee34141d9a20f96"} Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.806702 4815 generic.go:334] "Generic (PLEG): container finished" podID="b0a6811b-03ce-4007-9872-6b041a4c88f1" containerID="4095b39b99dfa224a2ff9085519c7fd0d57342ba85b7a3c9bd5a7eced09f97ee" exitCode=0 Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.807244 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0a6811b-03ce-4007-9872-6b041a4c88f1","Type":"ContainerDied","Data":"4095b39b99dfa224a2ff9085519c7fd0d57342ba85b7a3c9bd5a7eced09f97ee"} Dec 07 19:17:47 crc kubenswrapper[4815]: I1207 19:17:47.845381 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.845363543 podStartE2EDuration="2.845363543s" podCreationTimestamp="2025-12-07 19:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-07 19:17:47.841255039 +0000 UTC m=+172.420245084" watchObservedRunningTime="2025-12-07 19:17:47.845363543 +0000 UTC m=+172.424353588" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.602644 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rsb58" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.602936 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:48 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:48 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:48 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.602987 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.759891 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.768392 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/201e9ba8-3e19-4555-90f0-587497a2a328-metrics-certs\") pod \"network-metrics-daemon-xbq22\" (UID: \"201e9ba8-3e19-4555-90f0-587497a2a328\") " pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 
19:17:48.815316 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xbq22" Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.823601 4815 generic.go:334] "Generic (PLEG): container finished" podID="5c32f494-0d8c-47fd-9e99-b2473904a4da" containerID="7add81c6b791a40b4862f5cf4211c15b1460ce517a8930075ee34141d9a20f96" exitCode=0 Dec 07 19:17:48 crc kubenswrapper[4815]: I1207 19:17:48.823836 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c32f494-0d8c-47fd-9e99-b2473904a4da","Type":"ContainerDied","Data":"7add81c6b791a40b4862f5cf4211c15b1460ce517a8930075ee34141d9a20f96"} Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.256973 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.266891 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir\") pod \"b0a6811b-03ce-4007-9872-6b041a4c88f1\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.267006 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access\") pod \"b0a6811b-03ce-4007-9872-6b041a4c88f1\" (UID: \"b0a6811b-03ce-4007-9872-6b041a4c88f1\") " Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.267014 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b0a6811b-03ce-4007-9872-6b041a4c88f1" (UID: "b0a6811b-03ce-4007-9872-6b041a4c88f1"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.267273 4815 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b0a6811b-03ce-4007-9872-6b041a4c88f1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.296224 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b0a6811b-03ce-4007-9872-6b041a4c88f1" (UID: "b0a6811b-03ce-4007-9872-6b041a4c88f1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.368872 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b0a6811b-03ce-4007-9872-6b041a4c88f1-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.472756 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xbq22"] Dec 07 19:17:49 crc kubenswrapper[4815]: W1207 19:17:49.481584 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201e9ba8_3e19_4555_90f0_587497a2a328.slice/crio-1baec26411cd29aa9d1710339ce773eb0d050d44906b00c7af4260fc6305aa91 WatchSource:0}: Error finding container 1baec26411cd29aa9d1710339ce773eb0d050d44906b00c7af4260fc6305aa91: Status 404 returned error can't find the container with id 1baec26411cd29aa9d1710339ce773eb0d050d44906b00c7af4260fc6305aa91 Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.598143 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:49 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:49 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:49 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.598191 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.845951 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b0a6811b-03ce-4007-9872-6b041a4c88f1","Type":"ContainerDied","Data":"3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013"} Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.845987 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a83b5ebfd9c39a57a64f272237ca79f7aedfbefe47ea0ffa8fcec20a8006013" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.846055 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 07 19:17:49 crc kubenswrapper[4815]: I1207 19:17:49.851095 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbq22" event={"ID":"201e9ba8-3e19-4555-90f0-587497a2a328","Type":"ContainerStarted","Data":"1baec26411cd29aa9d1710339ce773eb0d050d44906b00c7af4260fc6305aa91"} Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.153754 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.176982 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir\") pod \"5c32f494-0d8c-47fd-9e99-b2473904a4da\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.177039 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access\") pod \"5c32f494-0d8c-47fd-9e99-b2473904a4da\" (UID: \"5c32f494-0d8c-47fd-9e99-b2473904a4da\") " Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.177095 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c32f494-0d8c-47fd-9e99-b2473904a4da" (UID: "5c32f494-0d8c-47fd-9e99-b2473904a4da"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.177388 4815 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c32f494-0d8c-47fd-9e99-b2473904a4da-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.181092 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c32f494-0d8c-47fd-9e99-b2473904a4da" (UID: "5c32f494-0d8c-47fd-9e99-b2473904a4da"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.277995 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c32f494-0d8c-47fd-9e99-b2473904a4da-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.597797 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:50 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:50 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:50 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.597887 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.887978 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5c32f494-0d8c-47fd-9e99-b2473904a4da","Type":"ContainerDied","Data":"6303cd0fdca6187e5651ac409fc8982da059e0e0489dbdd707a7692ab5077af5"} Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.888016 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6303cd0fdca6187e5651ac409fc8982da059e0e0489dbdd707a7692ab5077af5" Dec 07 19:17:50 crc kubenswrapper[4815]: I1207 19:17:50.888027 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 07 19:17:51 crc kubenswrapper[4815]: I1207 19:17:51.103937 4815 patch_prober.go:28] interesting pod/console-f9d7485db-xxlhj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 07 19:17:51 crc kubenswrapper[4815]: I1207 19:17:51.104023 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xxlhj" podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 07 19:17:51 crc kubenswrapper[4815]: I1207 19:17:51.597710 4815 patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:51 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:51 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:51 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:51 crc kubenswrapper[4815]: I1207 19:17:51.598038 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:51 crc kubenswrapper[4815]: I1207 19:17:51.903592 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbq22" event={"ID":"201e9ba8-3e19-4555-90f0-587497a2a328","Type":"ContainerStarted","Data":"6605ed4e1d3cc2e2f244c1e0e518fa58db52c32223db59739a89b75e7e78b0c1"} Dec 07 19:17:52 crc kubenswrapper[4815]: I1207 19:17:52.597905 4815 
patch_prober.go:28] interesting pod/router-default-5444994796-t955b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 07 19:17:52 crc kubenswrapper[4815]: [-]has-synced failed: reason withheld Dec 07 19:17:52 crc kubenswrapper[4815]: [+]process-running ok Dec 07 19:17:52 crc kubenswrapper[4815]: healthz check failed Dec 07 19:17:52 crc kubenswrapper[4815]: I1207 19:17:52.597970 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-t955b" podUID="e44f9afe-8dfb-40a8-802d-2dd4bae4d7e3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 07 19:17:52 crc kubenswrapper[4815]: I1207 19:17:52.715896 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gpv67" Dec 07 19:17:53 crc kubenswrapper[4815]: I1207 19:17:53.597511 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:53 crc kubenswrapper[4815]: I1207 19:17:53.605526 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-t955b" Dec 07 19:17:56 crc kubenswrapper[4815]: I1207 19:17:56.360121 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:17:56 crc kubenswrapper[4815]: I1207 19:17:56.360634 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:18:01 crc kubenswrapper[4815]: I1207 19:18:01.116797 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:18:01 crc kubenswrapper[4815]: I1207 19:18:01.123542 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:18:02 crc kubenswrapper[4815]: I1207 19:18:02.471683 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:18:09 crc kubenswrapper[4815]: I1207 19:18:09.940556 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 07 19:18:13 crc kubenswrapper[4815]: I1207 19:18:13.387810 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rw7gt" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.973995 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 07 19:18:17 crc kubenswrapper[4815]: E1207 19:18:17.975727 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c32f494-0d8c-47fd-9e99-b2473904a4da" containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.975892 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c32f494-0d8c-47fd-9e99-b2473904a4da" containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: E1207 19:18:17.976086 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a6811b-03ce-4007-9872-6b041a4c88f1" containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.976207 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a6811b-03ce-4007-9872-6b041a4c88f1" 
containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.976518 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a6811b-03ce-4007-9872-6b041a4c88f1" containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.976658 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c32f494-0d8c-47fd-9e99-b2473904a4da" containerName="pruner" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.977359 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.979617 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.987391 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 07 19:18:17 crc kubenswrapper[4815]: I1207 19:18:17.988167 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.103628 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.103871 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: 
I1207 19:18:18.204950 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.205041 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.205455 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.237717 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:18 crc kubenswrapper[4815]: I1207 19:18:18.317814 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:21 crc kubenswrapper[4815]: E1207 19:18:21.991891 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 07 19:18:21 crc kubenswrapper[4815]: E1207 19:18:21.992578 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lvjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Containe
rResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ssmr7_openshift-marketplace(34c69927-3b4e-4e18-8201-27eb981bad10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:21 crc kubenswrapper[4815]: E1207 19:18:21.995170 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ssmr7" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.163832 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.171605 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.182834 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.268467 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.268635 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" 
Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.268676 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.370254 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.370306 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.370329 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.370498 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.370545 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.387645 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access\") pod \"installer-9-crc\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:22 crc kubenswrapper[4815]: I1207 19:18:22.497075 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:18:23 crc kubenswrapper[4815]: E1207 19:18:23.758016 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-ssmr7" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" Dec 07 19:18:23 crc kubenswrapper[4815]: E1207 19:18:23.827636 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 07 19:18:23 crc kubenswrapper[4815]: E1207 19:18:23.827854 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qhtr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7c6vz_openshift-marketplace(6da4a462-3799-4630-80bf-91b5b8112d23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:23 crc kubenswrapper[4815]: E1207 19:18:23.829243 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7c6vz" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" Dec 07 19:18:24 crc 
kubenswrapper[4815]: E1207 19:18:24.422287 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 07 19:18:24 crc kubenswrapper[4815]: E1207 19:18:24.422823 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jtfrt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-sbt7w_openshift-marketplace(04fe5f63-515f-4b66-963c-c2ce259b9bad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:24 crc kubenswrapper[4815]: E1207 19:18:24.424239 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sbt7w" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" Dec 07 19:18:26 crc kubenswrapper[4815]: I1207 19:18:26.359982 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:18:26 crc kubenswrapper[4815]: I1207 19:18:26.360078 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:18:26 crc kubenswrapper[4815]: I1207 19:18:26.360145 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:18:26 crc kubenswrapper[4815]: I1207 19:18:26.361299 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 07 19:18:26 crc kubenswrapper[4815]: I1207 19:18:26.361483 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e" gracePeriod=600 Dec 07 19:18:29 crc kubenswrapper[4815]: E1207 19:18:29.538419 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sbt7w" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" Dec 07 19:18:29 crc kubenswrapper[4815]: E1207 19:18:29.538469 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7c6vz" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" Dec 07 19:18:30 crc kubenswrapper[4815]: I1207 19:18:30.138763 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e" exitCode=0 Dec 07 19:18:30 crc kubenswrapper[4815]: I1207 19:18:30.138803 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e"} Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.548154 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.548815 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48h9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-52nff_openshift-marketplace(b9de4e38-617b-41a4-b97f-155d559d497a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
logger="UnhandledError" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.550075 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-52nff" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.621257 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.621516 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.621679 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lt8qc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8bsfd_openshift-marketplace(7fd60078-7b6f-4267-9087-b450e5d67a09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.621831 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmwpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-csjhg_openshift-marketplace(0eff7fdb-d25d-4a77-b828-1901c31f091d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.622906 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-csjhg" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" Dec 07 19:18:32 crc 
kubenswrapper[4815]: E1207 19:18:32.622965 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8bsfd" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.653669 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.654001 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4x9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bjqjh_openshift-marketplace(cc15074d-3de2-4533-84f9-d400e3400019): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.655140 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bjqjh" podUID="cc15074d-3de2-4533-84f9-d400e3400019" Dec 07 19:18:32 crc 
kubenswrapper[4815]: E1207 19:18:32.670770 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.671276 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hvgpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-nb88w_openshift-marketplace(69cc2280-9dd4-43d1-87d6-c54a5b801a32): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 07 19:18:32 crc kubenswrapper[4815]: E1207 19:18:32.672435 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nb88w" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" Dec 07 19:18:32 crc kubenswrapper[4815]: I1207 19:18:32.842484 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 07 19:18:32 crc kubenswrapper[4815]: I1207 19:18:32.951575 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 07 19:18:32 crc kubenswrapper[4815]: W1207 19:18:32.966557 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod51575da5_e649_457f_b2af_149a0c4103f6.slice/crio-645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637 WatchSource:0}: Error finding container 645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637: Status 404 returned error can't find the container with id 645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637 Dec 07 19:18:33 crc kubenswrapper[4815]: I1207 19:18:33.155809 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51575da5-e649-457f-b2af-149a0c4103f6","Type":"ContainerStarted","Data":"645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637"} Dec 07 19:18:33 crc kubenswrapper[4815]: I1207 19:18:33.158377 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xbq22" 
event={"ID":"201e9ba8-3e19-4555-90f0-587497a2a328","Type":"ContainerStarted","Data":"ace9333538dfb0ac10e7b0590e2e7202f3ae523ad330e331b5241104b16f7ec8"} Dec 07 19:18:33 crc kubenswrapper[4815]: I1207 19:18:33.163129 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc"} Dec 07 19:18:33 crc kubenswrapper[4815]: I1207 19:18:33.164986 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8f52b789-4509-4834-95f0-62d38f4e620c","Type":"ContainerStarted","Data":"1393b1547e7b18d52d852c2b98cb047a3bbf6001736643a2f8d414666b404ec2"} Dec 07 19:18:33 crc kubenswrapper[4815]: E1207 19:18:33.167344 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-csjhg" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" Dec 07 19:18:33 crc kubenswrapper[4815]: E1207 19:18:33.167434 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8bsfd" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" Dec 07 19:18:33 crc kubenswrapper[4815]: E1207 19:18:33.167525 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-52nff" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" Dec 07 
19:18:33 crc kubenswrapper[4815]: E1207 19:18:33.167557 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bjqjh" podUID="cc15074d-3de2-4533-84f9-d400e3400019" Dec 07 19:18:33 crc kubenswrapper[4815]: E1207 19:18:33.167699 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nb88w" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" Dec 07 19:18:33 crc kubenswrapper[4815]: I1207 19:18:33.191310 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xbq22" podStartSLOduration=188.19129366 podStartE2EDuration="3m8.19129366s" podCreationTimestamp="2025-12-07 19:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:18:33.173383486 +0000 UTC m=+217.752373531" watchObservedRunningTime="2025-12-07 19:18:33.19129366 +0000 UTC m=+217.770283705" Dec 07 19:18:34 crc kubenswrapper[4815]: I1207 19:18:34.171010 4815 generic.go:334] "Generic (PLEG): container finished" podID="51575da5-e649-457f-b2af-149a0c4103f6" containerID="214050dc5e0760bda75f20d829eba4ecdb7fa8e1686594ba71fa24a57466e42d" exitCode=0 Dec 07 19:18:34 crc kubenswrapper[4815]: I1207 19:18:34.171479 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51575da5-e649-457f-b2af-149a0c4103f6","Type":"ContainerDied","Data":"214050dc5e0760bda75f20d829eba4ecdb7fa8e1686594ba71fa24a57466e42d"} Dec 07 19:18:34 crc kubenswrapper[4815]: I1207 19:18:34.174291 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8f52b789-4509-4834-95f0-62d38f4e620c","Type":"ContainerStarted","Data":"02391ea487d6d52c854b6c17e4271168ad5518e6301ed59d9b0448f19c9d42e1"} Dec 07 19:18:34 crc kubenswrapper[4815]: I1207 19:18:34.208281 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=12.208265966999999 podStartE2EDuration="12.208265967s" podCreationTimestamp="2025-12-07 19:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:18:34.206058536 +0000 UTC m=+218.785048581" watchObservedRunningTime="2025-12-07 19:18:34.208265967 +0000 UTC m=+218.787256012" Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.463982 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.543975 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access\") pod \"51575da5-e649-457f-b2af-149a0c4103f6\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.544232 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir\") pod \"51575da5-e649-457f-b2af-149a0c4103f6\" (UID: \"51575da5-e649-457f-b2af-149a0c4103f6\") " Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.544558 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"51575da5-e649-457f-b2af-149a0c4103f6" (UID: "51575da5-e649-457f-b2af-149a0c4103f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.550059 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "51575da5-e649-457f-b2af-149a0c4103f6" (UID: "51575da5-e649-457f-b2af-149a0c4103f6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.645171 4815 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51575da5-e649-457f-b2af-149a0c4103f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:18:35 crc kubenswrapper[4815]: I1207 19:18:35.645388 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51575da5-e649-457f-b2af-149a0c4103f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:18:36 crc kubenswrapper[4815]: I1207 19:18:36.185357 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"51575da5-e649-457f-b2af-149a0c4103f6","Type":"ContainerDied","Data":"645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637"} Dec 07 19:18:36 crc kubenswrapper[4815]: I1207 19:18:36.185621 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645c6db3e0324a9861327c0bc084925e77cf15fb7f06a9ff805f7b6c9e64c637" Dec 07 19:18:36 crc kubenswrapper[4815]: I1207 19:18:36.185408 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 07 19:18:40 crc kubenswrapper[4815]: I1207 19:18:40.208044 4815 generic.go:334] "Generic (PLEG): container finished" podID="34c69927-3b4e-4e18-8201-27eb981bad10" containerID="75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc" exitCode=0 Dec 07 19:18:40 crc kubenswrapper[4815]: I1207 19:18:40.208129 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerDied","Data":"75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc"} Dec 07 19:18:40 crc kubenswrapper[4815]: I1207 19:18:40.826015 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:18:41 crc kubenswrapper[4815]: I1207 19:18:41.215077 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerStarted","Data":"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96"} Dec 07 19:18:41 crc kubenswrapper[4815]: I1207 19:18:41.235711 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ssmr7" podStartSLOduration=2.269595606 podStartE2EDuration="59.235696248s" podCreationTimestamp="2025-12-07 19:17:42 +0000 UTC" firstStartedPulling="2025-12-07 19:17:43.664080992 +0000 UTC m=+168.243071037" lastFinishedPulling="2025-12-07 19:18:40.630181594 +0000 UTC m=+225.209171679" observedRunningTime="2025-12-07 19:18:41.233459036 +0000 UTC m=+225.812449081" watchObservedRunningTime="2025-12-07 19:18:41.235696248 +0000 UTC m=+225.814686293" Dec 07 19:18:42 crc kubenswrapper[4815]: I1207 19:18:42.221154 4815 generic.go:334] "Generic (PLEG): container finished" podID="6da4a462-3799-4630-80bf-91b5b8112d23" 
containerID="37ec65185ef52c45483f50b2c6831785c65f4e601852a4b3f0cc93bea8101c6b" exitCode=0 Dec 07 19:18:42 crc kubenswrapper[4815]: I1207 19:18:42.222189 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerDied","Data":"37ec65185ef52c45483f50b2c6831785c65f4e601852a4b3f0cc93bea8101c6b"} Dec 07 19:18:42 crc kubenswrapper[4815]: I1207 19:18:42.607827 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:18:42 crc kubenswrapper[4815]: I1207 19:18:42.608168 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:18:43 crc kubenswrapper[4815]: I1207 19:18:43.228787 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerStarted","Data":"c06daabe4e4232067fe74a2f8b122b84d2581bb0390edd4eec0098bad9e4aebe"} Dec 07 19:18:43 crc kubenswrapper[4815]: I1207 19:18:43.231008 4815 generic.go:334] "Generic (PLEG): container finished" podID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerID="395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c" exitCode=0 Dec 07 19:18:43 crc kubenswrapper[4815]: I1207 19:18:43.231032 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerDied","Data":"395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c"} Dec 07 19:18:43 crc kubenswrapper[4815]: I1207 19:18:43.247690 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7c6vz" podStartSLOduration=3.3076507250000002 podStartE2EDuration="1m1.247671936s" podCreationTimestamp="2025-12-07 19:17:42 +0000 UTC" 
firstStartedPulling="2025-12-07 19:17:44.706979225 +0000 UTC m=+169.285969270" lastFinishedPulling="2025-12-07 19:18:42.647000436 +0000 UTC m=+227.225990481" observedRunningTime="2025-12-07 19:18:43.244505598 +0000 UTC m=+227.823495643" watchObservedRunningTime="2025-12-07 19:18:43.247671936 +0000 UTC m=+227.826661981" Dec 07 19:18:43 crc kubenswrapper[4815]: I1207 19:18:43.672601 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ssmr7" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="registry-server" probeResult="failure" output=< Dec 07 19:18:43 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:18:43 crc kubenswrapper[4815]: > Dec 07 19:18:44 crc kubenswrapper[4815]: I1207 19:18:44.237773 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerStarted","Data":"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4"} Dec 07 19:18:44 crc kubenswrapper[4815]: I1207 19:18:44.259939 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbt7w" podStartSLOduration=4.29148276 podStartE2EDuration="1m4.259886012s" podCreationTimestamp="2025-12-07 19:17:40 +0000 UTC" firstStartedPulling="2025-12-07 19:17:43.652579344 +0000 UTC m=+168.231569389" lastFinishedPulling="2025-12-07 19:18:43.620982596 +0000 UTC m=+228.199972641" observedRunningTime="2025-12-07 19:18:44.256087017 +0000 UTC m=+228.835077062" watchObservedRunningTime="2025-12-07 19:18:44.259886012 +0000 UTC m=+228.838876057" Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.263900 4815 generic.go:334] "Generic (PLEG): container finished" podID="cc15074d-3de2-4533-84f9-d400e3400019" containerID="8f38fff6e3d344a1504be0592a0438559876ad2a2c4c7fded528e2646c8952d2" exitCode=0 Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 
19:18:50.263951 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerDied","Data":"8f38fff6e3d344a1504be0592a0438559876ad2a2c4c7fded528e2646c8952d2"} Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.270401 4815 generic.go:334] "Generic (PLEG): container finished" podID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerID="64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba" exitCode=0 Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.270466 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerDied","Data":"64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba"} Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.273396 4815 generic.go:334] "Generic (PLEG): container finished" podID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerID="29a859eddbc43e2242237a847505a484989163e31a19b7c2dc3967081ebe26be" exitCode=0 Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.273445 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerDied","Data":"29a859eddbc43e2242237a847505a484989163e31a19b7c2dc3967081ebe26be"} Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.277110 4815 generic.go:334] "Generic (PLEG): container finished" podID="b9de4e38-617b-41a4-b97f-155d559d497a" containerID="60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a" exitCode=0 Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.277158 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" 
event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerDied","Data":"60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a"} Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.279637 4815 generic.go:334] "Generic (PLEG): container finished" podID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerID="277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590" exitCode=0 Dec 07 19:18:50 crc kubenswrapper[4815]: I1207 19:18:50.279669 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerDied","Data":"277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590"} Dec 07 19:18:51 crc kubenswrapper[4815]: I1207 19:18:51.529521 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:18:51 crc kubenswrapper[4815]: I1207 19:18:51.529642 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:18:51 crc kubenswrapper[4815]: I1207 19:18:51.616136 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:18:52 crc kubenswrapper[4815]: I1207 19:18:52.354952 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:18:52 crc kubenswrapper[4815]: I1207 19:18:52.641198 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:18:52 crc kubenswrapper[4815]: I1207 19:18:52.903807 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:18:52 crc kubenswrapper[4815]: I1207 19:18:52.983149 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:18:52 crc kubenswrapper[4815]: I1207 19:18:52.983184 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.021680 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.336228 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerStarted","Data":"f1c373bbd4bfdda8d187f066cf1e9bf1b3886d28770b14c0a339e6d7706669e8"} Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.342110 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerStarted","Data":"00f805c5ddd516abbdda2284be05cc5e83ccd221593da3cb831ecb32373fdb8e"} Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.347197 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerStarted","Data":"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2"} Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.375384 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-csjhg" podStartSLOduration=3.823611129 podStartE2EDuration="1m14.37536089s" podCreationTimestamp="2025-12-07 19:17:39 +0000 UTC" firstStartedPulling="2025-12-07 19:17:42.434499712 +0000 UTC m=+167.013489757" lastFinishedPulling="2025-12-07 19:18:52.986249473 +0000 UTC m=+237.565239518" observedRunningTime="2025-12-07 19:18:53.373018246 +0000 UTC m=+237.952008291" watchObservedRunningTime="2025-12-07 
19:18:53.37536089 +0000 UTC m=+237.954350945" Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.394735 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8bsfd" podStartSLOduration=1.757479603 podStartE2EDuration="1m12.394717165s" podCreationTimestamp="2025-12-07 19:17:41 +0000 UTC" firstStartedPulling="2025-12-07 19:17:42.429279138 +0000 UTC m=+167.008269183" lastFinishedPulling="2025-12-07 19:18:53.06651669 +0000 UTC m=+237.645506745" observedRunningTime="2025-12-07 19:18:53.39345336 +0000 UTC m=+237.972443405" watchObservedRunningTime="2025-12-07 19:18:53.394717165 +0000 UTC m=+237.973707210" Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.395354 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:18:53 crc kubenswrapper[4815]: I1207 19:18:53.414005 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bjqjh" podStartSLOduration=3.874675349 podStartE2EDuration="1m14.413987817s" podCreationTimestamp="2025-12-07 19:17:39 +0000 UTC" firstStartedPulling="2025-12-07 19:17:42.406722195 +0000 UTC m=+166.985712240" lastFinishedPulling="2025-12-07 19:18:52.946034663 +0000 UTC m=+237.525024708" observedRunningTime="2025-12-07 19:18:53.412759193 +0000 UTC m=+237.991749238" watchObservedRunningTime="2025-12-07 19:18:53.413987817 +0000 UTC m=+237.992977862" Dec 07 19:18:54 crc kubenswrapper[4815]: I1207 19:18:54.357030 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerStarted","Data":"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88"} Dec 07 19:18:54 crc kubenswrapper[4815]: I1207 19:18:54.359065 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" 
event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerStarted","Data":"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b"} Dec 07 19:18:54 crc kubenswrapper[4815]: I1207 19:18:54.422445 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52nff" podStartSLOduration=3.554634139 podStartE2EDuration="1m15.422430909s" podCreationTimestamp="2025-12-07 19:17:39 +0000 UTC" firstStartedPulling="2025-12-07 19:17:41.307424053 +0000 UTC m=+165.886414098" lastFinishedPulling="2025-12-07 19:18:53.175220823 +0000 UTC m=+237.754210868" observedRunningTime="2025-12-07 19:18:54.420688011 +0000 UTC m=+238.999678056" watchObservedRunningTime="2025-12-07 19:18:54.422430909 +0000 UTC m=+239.001420954" Dec 07 19:18:54 crc kubenswrapper[4815]: I1207 19:18:54.443298 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nb88w" podStartSLOduration=5.561644273 podStartE2EDuration="1m16.443282755s" podCreationTimestamp="2025-12-07 19:17:38 +0000 UTC" firstStartedPulling="2025-12-07 19:17:42.386732113 +0000 UTC m=+166.965722158" lastFinishedPulling="2025-12-07 19:18:53.268370595 +0000 UTC m=+237.847360640" observedRunningTime="2025-12-07 19:18:54.441087495 +0000 UTC m=+239.020077540" watchObservedRunningTime="2025-12-07 19:18:54.443282755 +0000 UTC m=+239.022272800" Dec 07 19:18:56 crc kubenswrapper[4815]: I1207 19:18:56.605967 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:18:56 crc kubenswrapper[4815]: I1207 19:18:56.606180 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7c6vz" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="registry-server" containerID="cri-o://c06daabe4e4232067fe74a2f8b122b84d2581bb0390edd4eec0098bad9e4aebe" gracePeriod=2 Dec 07 19:18:59 crc kubenswrapper[4815]: 
I1207 19:18:59.224096 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.224614 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.271989 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.395600 4815 generic.go:334] "Generic (PLEG): container finished" podID="6da4a462-3799-4630-80bf-91b5b8112d23" containerID="c06daabe4e4232067fe74a2f8b122b84d2581bb0390edd4eec0098bad9e4aebe" exitCode=0 Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.396083 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerDied","Data":"c06daabe4e4232067fe74a2f8b122b84d2581bb0390edd4eec0098bad9e4aebe"} Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.441410 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.700338 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.700687 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.743204 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.907104 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.907574 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:18:59 crc kubenswrapper[4815]: I1207 19:18:59.992121 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.116376 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.117320 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.131132 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.179471 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.265452 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities\") pod \"6da4a462-3799-4630-80bf-91b5b8112d23\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.265595 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content\") pod \"6da4a462-3799-4630-80bf-91b5b8112d23\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.265682 4815 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhtr\" (UniqueName: \"kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr\") pod \"6da4a462-3799-4630-80bf-91b5b8112d23\" (UID: \"6da4a462-3799-4630-80bf-91b5b8112d23\") " Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.266260 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities" (OuterVolumeSpecName: "utilities") pod "6da4a462-3799-4630-80bf-91b5b8112d23" (UID: "6da4a462-3799-4630-80bf-91b5b8112d23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.275220 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr" (OuterVolumeSpecName: "kube-api-access-6qhtr") pod "6da4a462-3799-4630-80bf-91b5b8112d23" (UID: "6da4a462-3799-4630-80bf-91b5b8112d23"). InnerVolumeSpecName "kube-api-access-6qhtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.367353 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.367398 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qhtr\" (UniqueName: \"kubernetes.io/projected/6da4a462-3799-4630-80bf-91b5b8112d23-kube-api-access-6qhtr\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.369650 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6da4a462-3799-4630-80bf-91b5b8112d23" (UID: "6da4a462-3799-4630-80bf-91b5b8112d23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.405578 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7c6vz" event={"ID":"6da4a462-3799-4630-80bf-91b5b8112d23","Type":"ContainerDied","Data":"c5bd3b68defec579113115dbc009a4ce4a1d514c803e7d096b477c8163e0c4a1"} Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.405647 4815 scope.go:117] "RemoveContainer" containerID="c06daabe4e4232067fe74a2f8b122b84d2581bb0390edd4eec0098bad9e4aebe" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.406987 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7c6vz" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.440551 4815 scope.go:117] "RemoveContainer" containerID="37ec65185ef52c45483f50b2c6831785c65f4e601852a4b3f0cc93bea8101c6b" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.451779 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.460887 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7c6vz"] Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.470306 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6da4a462-3799-4630-80bf-91b5b8112d23-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.471113 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.485054 4815 scope.go:117] "RemoveContainer" containerID="3730e3c0ed5cca0e8ef7227fe50756bf34d246608f59ab8d59fb377200b3a43d" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.512751 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:19:00 crc kubenswrapper[4815]: I1207 19:19:00.557593 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:19:01 crc kubenswrapper[4815]: I1207 19:19:01.659073 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:19:01 crc kubenswrapper[4815]: I1207 19:19:01.659148 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 
07 19:19:01 crc kubenswrapper[4815]: I1207 19:19:01.697679 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:19:01 crc kubenswrapper[4815]: I1207 19:19:01.777686 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" path="/var/lib/kubelet/pods/6da4a462-3799-4630-80bf-91b5b8112d23/volumes" Dec 07 19:19:02 crc kubenswrapper[4815]: I1207 19:19:02.477185 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:19:03 crc kubenswrapper[4815]: I1207 19:19:03.014552 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:19:03 crc kubenswrapper[4815]: I1207 19:19:03.425788 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bjqjh" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="registry-server" containerID="cri-o://00f805c5ddd516abbdda2284be05cc5e83ccd221593da3cb831ecb32373fdb8e" gracePeriod=2 Dec 07 19:19:03 crc kubenswrapper[4815]: I1207 19:19:03.610381 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:19:03 crc kubenswrapper[4815]: I1207 19:19:03.610988 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-csjhg" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="registry-server" containerID="cri-o://f1c373bbd4bfdda8d187f066cf1e9bf1b3886d28770b14c0a339e6d7706669e8" gracePeriod=2 Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.445502 4815 generic.go:334] "Generic (PLEG): container finished" podID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerID="f1c373bbd4bfdda8d187f066cf1e9bf1b3886d28770b14c0a339e6d7706669e8" exitCode=0 Dec 07 19:19:05 crc 
kubenswrapper[4815]: I1207 19:19:05.445611 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerDied","Data":"f1c373bbd4bfdda8d187f066cf1e9bf1b3886d28770b14c0a339e6d7706669e8"} Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.449818 4815 generic.go:334] "Generic (PLEG): container finished" podID="cc15074d-3de2-4533-84f9-d400e3400019" containerID="00f805c5ddd516abbdda2284be05cc5e83ccd221593da3cb831ecb32373fdb8e" exitCode=0 Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.449912 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerDied","Data":"00f805c5ddd516abbdda2284be05cc5e83ccd221593da3cb831ecb32373fdb8e"} Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.690683 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.807819 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.845018 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities\") pod \"cc15074d-3de2-4533-84f9-d400e3400019\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.845083 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4x9n\" (UniqueName: \"kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n\") pod \"cc15074d-3de2-4533-84f9-d400e3400019\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.845119 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content\") pod \"cc15074d-3de2-4533-84f9-d400e3400019\" (UID: \"cc15074d-3de2-4533-84f9-d400e3400019\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.846339 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities" (OuterVolumeSpecName: "utilities") pod "cc15074d-3de2-4533-84f9-d400e3400019" (UID: "cc15074d-3de2-4533-84f9-d400e3400019"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.850095 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n" (OuterVolumeSpecName: "kube-api-access-f4x9n") pod "cc15074d-3de2-4533-84f9-d400e3400019" (UID: "cc15074d-3de2-4533-84f9-d400e3400019"). InnerVolumeSpecName "kube-api-access-f4x9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.874848 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" podUID="4dbec086-a865-4859-adbf-ab61d8395463" containerName="oauth-openshift" containerID="cri-o://1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87" gracePeriod=15 Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.899986 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc15074d-3de2-4533-84f9-d400e3400019" (UID: "cc15074d-3de2-4533-84f9-d400e3400019"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.946772 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities\") pod \"0eff7fdb-d25d-4a77-b828-1901c31f091d\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.946833 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmwpl\" (UniqueName: \"kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl\") pod \"0eff7fdb-d25d-4a77-b828-1901c31f091d\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.946887 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content\") pod \"0eff7fdb-d25d-4a77-b828-1901c31f091d\" (UID: \"0eff7fdb-d25d-4a77-b828-1901c31f091d\") " Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 
19:19:05.947395 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.947415 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4x9n\" (UniqueName: \"kubernetes.io/projected/cc15074d-3de2-4533-84f9-d400e3400019-kube-api-access-f4x9n\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.947425 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc15074d-3de2-4533-84f9-d400e3400019-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.947549 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities" (OuterVolumeSpecName: "utilities") pod "0eff7fdb-d25d-4a77-b828-1901c31f091d" (UID: "0eff7fdb-d25d-4a77-b828-1901c31f091d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:05 crc kubenswrapper[4815]: I1207 19:19:05.951832 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl" (OuterVolumeSpecName: "kube-api-access-qmwpl") pod "0eff7fdb-d25d-4a77-b828-1901c31f091d" (UID: "0eff7fdb-d25d-4a77-b828-1901c31f091d"). InnerVolumeSpecName "kube-api-access-qmwpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.010562 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0eff7fdb-d25d-4a77-b828-1901c31f091d" (UID: "0eff7fdb-d25d-4a77-b828-1901c31f091d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.012307 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.012693 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8bsfd" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="registry-server" containerID="cri-o://4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2" gracePeriod=2 Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.048332 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.048364 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmwpl\" (UniqueName: \"kubernetes.io/projected/0eff7fdb-d25d-4a77-b828-1901c31f091d-kube-api-access-qmwpl\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.048379 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eff7fdb-d25d-4a77-b828-1901c31f091d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.259710 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.364491 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452679 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452751 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452779 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452805 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452846 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452862 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452937 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.452961 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453003 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453026 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453068 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sth8\" (UniqueName: \"kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453086 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453130 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453160 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login\") pod \"4dbec086-a865-4859-adbf-ab61d8395463\" (UID: \"4dbec086-a865-4859-adbf-ab61d8395463\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453474 4815 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4dbec086-a865-4859-adbf-ab61d8395463-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.453839 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.455735 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.455802 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456049 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456266 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456598 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456689 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456739 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.456788 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.457299 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.458827 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8" (OuterVolumeSpecName: "kube-api-access-4sth8") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "kube-api-access-4sth8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.460057 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.460348 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4dbec086-a865-4859-adbf-ab61d8395463" (UID: "4dbec086-a865-4859-adbf-ab61d8395463"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.461189 4815 generic.go:334] "Generic (PLEG): container finished" podID="4dbec086-a865-4859-adbf-ab61d8395463" containerID="1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87" exitCode=0 Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.461229 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.461252 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" event={"ID":"4dbec086-a865-4859-adbf-ab61d8395463","Type":"ContainerDied","Data":"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.461281 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r924c" event={"ID":"4dbec086-a865-4859-adbf-ab61d8395463","Type":"ContainerDied","Data":"f6d4a5da3a34ecd3946091b6fd90aa5ee96c95db062dd3ead3309532f7ca6854"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.461295 4815 scope.go:117] "RemoveContainer" containerID="1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.469437 4815 generic.go:334] "Generic (PLEG): container finished" podID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerID="4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2" exitCode=0 Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.471703 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerDied","Data":"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.471736 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8bsfd" event={"ID":"7fd60078-7b6f-4267-9087-b450e5d67a09","Type":"ContainerDied","Data":"5ed58502660fd6cac4ab2e6fccdc65e135de216859e48c19dec62390e0918bea"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.471811 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8bsfd" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.477824 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-csjhg" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.485197 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csjhg" event={"ID":"0eff7fdb-d25d-4a77-b828-1901c31f091d","Type":"ContainerDied","Data":"3adad301e389def75136f598732276aede87e3eefe4ff7a0437785bbc0cb0894"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.490598 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjqjh" event={"ID":"cc15074d-3de2-4533-84f9-d400e3400019","Type":"ContainerDied","Data":"c23a3434eff2074ea14c7992788964839fc73b4e20b61ccb77f1951512ff169a"} Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.490678 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bjqjh" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.491532 4815 scope.go:117] "RemoveContainer" containerID="1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87" Dec 07 19:19:06 crc kubenswrapper[4815]: E1207 19:19:06.491832 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87\": container with ID starting with 1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87 not found: ID does not exist" containerID="1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.491863 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87"} err="failed to get container status \"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87\": rpc error: code = NotFound desc = could not find container \"1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87\": container with ID starting with 1b93670e74fe118c42b4ca9ec9658221c91c2bcd3013cb5a33e119010bddbd87 not found: ID does not exist" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.491887 4815 scope.go:117] "RemoveContainer" containerID="4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.500735 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.521129 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r924c"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.522750 4815 scope.go:117] "RemoveContainer" 
containerID="64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.525370 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.529229 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-csjhg"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.537881 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.541567 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bjqjh"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554090 4815 scope.go:117] "RemoveContainer" containerID="90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554410 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities\") pod \"7fd60078-7b6f-4267-9087-b450e5d67a09\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554468 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content\") pod \"7fd60078-7b6f-4267-9087-b450e5d67a09\" (UID: \"7fd60078-7b6f-4267-9087-b450e5d67a09\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554493 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt8qc\" (UniqueName: \"kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc\") pod \"7fd60078-7b6f-4267-9087-b450e5d67a09\" (UID: 
\"7fd60078-7b6f-4267-9087-b450e5d67a09\") " Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554736 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554747 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554758 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554766 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554778 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554788 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554798 4815 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554807 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554815 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554823 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554833 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sth8\" (UniqueName: \"kubernetes.io/projected/4dbec086-a865-4859-adbf-ab61d8395463-kube-api-access-4sth8\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554844 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.554853 4815 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dbec086-a865-4859-adbf-ab61d8395463-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 
19:19:06.555335 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities" (OuterVolumeSpecName: "utilities") pod "7fd60078-7b6f-4267-9087-b450e5d67a09" (UID: "7fd60078-7b6f-4267-9087-b450e5d67a09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.557876 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc" (OuterVolumeSpecName: "kube-api-access-lt8qc") pod "7fd60078-7b6f-4267-9087-b450e5d67a09" (UID: "7fd60078-7b6f-4267-9087-b450e5d67a09"). InnerVolumeSpecName "kube-api-access-lt8qc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.571128 4815 scope.go:117] "RemoveContainer" containerID="4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2" Dec 07 19:19:06 crc kubenswrapper[4815]: E1207 19:19:06.571543 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2\": container with ID starting with 4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2 not found: ID does not exist" containerID="4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.571587 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2"} err="failed to get container status \"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2\": rpc error: code = NotFound desc = could not find container \"4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2\": container with ID starting with 
4f19ecbb87ceb09940f659589ecd9d6582f20f3dc99dee45653c3e8c8840f6e2 not found: ID does not exist" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.571613 4815 scope.go:117] "RemoveContainer" containerID="64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba" Dec 07 19:19:06 crc kubenswrapper[4815]: E1207 19:19:06.571972 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba\": container with ID starting with 64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba not found: ID does not exist" containerID="64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.571994 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba"} err="failed to get container status \"64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba\": rpc error: code = NotFound desc = could not find container \"64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba\": container with ID starting with 64da62e11e447f55c0255926a204b9a8768a28a055cf53497473b8ba83689dba not found: ID does not exist" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.572008 4815 scope.go:117] "RemoveContainer" containerID="90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0" Dec 07 19:19:06 crc kubenswrapper[4815]: E1207 19:19:06.572259 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0\": container with ID starting with 90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0 not found: ID does not exist" containerID="90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0" Dec 07 19:19:06 crc 
kubenswrapper[4815]: I1207 19:19:06.572290 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0"} err="failed to get container status \"90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0\": rpc error: code = NotFound desc = could not find container \"90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0\": container with ID starting with 90cfa190a27906b003c546e4e9fd0314b4d95876ae395fca90c70b08000458c0 not found: ID does not exist" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.572312 4815 scope.go:117] "RemoveContainer" containerID="f1c373bbd4bfdda8d187f066cf1e9bf1b3886d28770b14c0a339e6d7706669e8" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.573215 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fd60078-7b6f-4267-9087-b450e5d67a09" (UID: "7fd60078-7b6f-4267-9087-b450e5d67a09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.585238 4815 scope.go:117] "RemoveContainer" containerID="29a859eddbc43e2242237a847505a484989163e31a19b7c2dc3967081ebe26be" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.603558 4815 scope.go:117] "RemoveContainer" containerID="1d2f92f00630a1c5db7a72b94ed0b537a6ec7ca0eea838756639e968dad2dd29" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.614496 4815 scope.go:117] "RemoveContainer" containerID="00f805c5ddd516abbdda2284be05cc5e83ccd221593da3cb831ecb32373fdb8e" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.626367 4815 scope.go:117] "RemoveContainer" containerID="8f38fff6e3d344a1504be0592a0438559876ad2a2c4c7fded528e2646c8952d2" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.641255 4815 scope.go:117] "RemoveContainer" containerID="46387a3d812b105dfba81410ab6b66d43d087afb667def6e51cd33345f719b04" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.656058 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.656080 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd60078-7b6f-4267-9087-b450e5d67a09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.656127 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt8qc\" (UniqueName: \"kubernetes.io/projected/7fd60078-7b6f-4267-9087-b450e5d67a09-kube-api-access-lt8qc\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.803895 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:19:06 crc kubenswrapper[4815]: I1207 19:19:06.807664 
4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8bsfd"] Dec 07 19:19:07 crc kubenswrapper[4815]: I1207 19:19:07.784281 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" path="/var/lib/kubelet/pods/0eff7fdb-d25d-4a77-b828-1901c31f091d/volumes" Dec 07 19:19:07 crc kubenswrapper[4815]: I1207 19:19:07.786513 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dbec086-a865-4859-adbf-ab61d8395463" path="/var/lib/kubelet/pods/4dbec086-a865-4859-adbf-ab61d8395463/volumes" Dec 07 19:19:07 crc kubenswrapper[4815]: I1207 19:19:07.787676 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" path="/var/lib/kubelet/pods/7fd60078-7b6f-4267-9087-b450e5d67a09/volumes" Dec 07 19:19:07 crc kubenswrapper[4815]: I1207 19:19:07.789704 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc15074d-3de2-4533-84f9-d400e3400019" path="/var/lib/kubelet/pods/cc15074d-3de2-4533-84f9-d400e3400019/volumes" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.761687 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp"] Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.763139 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.763299 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.763444 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51575da5-e649-457f-b2af-149a0c4103f6" containerName="pruner" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.763590 4815 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51575da5-e649-457f-b2af-149a0c4103f6" containerName="pruner" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.763714 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.764014 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.764142 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.764253 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.764379 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.764489 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.764604 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.764754 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.764885 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.765026 4815 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.765143 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.765249 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.765359 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.765465 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.765597 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.765709 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.765834 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.765984 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="extract-content" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.766151 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dbec086-a865-4859-adbf-ab61d8395463" containerName="oauth-openshift" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.766311 4815 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4dbec086-a865-4859-adbf-ab61d8395463" containerName="oauth-openshift" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.766503 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.766664 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: E1207 19:19:09.766824 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.767030 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="extract-utilities" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.767467 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc15074d-3de2-4533-84f9-d400e3400019" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.767651 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd60078-7b6f-4267-9087-b450e5d67a09" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.767856 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="6da4a462-3799-4630-80bf-91b5b8112d23" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.768182 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dbec086-a865-4859-adbf-ab61d8395463" containerName="oauth-openshift" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.768334 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eff7fdb-d25d-4a77-b828-1901c31f091d" containerName="registry-server" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.768496 4815 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="51575da5-e649-457f-b2af-149a0c4103f6" containerName="pruner" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.769318 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.777358 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.777783 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.779040 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.779429 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.779581 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.783744 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.783836 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.784053 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.784061 4815 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.784197 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.784145 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.784516 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.785910 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp"] Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.820614 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.829149 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.833115 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.898675 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.898741 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.898776 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.898893 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.898998 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899034 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899060 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899164 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899191 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899217 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc 
kubenswrapper[4815]: I1207 19:19:09.899242 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899270 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899291 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:09 crc kubenswrapper[4815]: I1207 19:19:09.899412 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj2dm\" (UniqueName: \"kubernetes.io/projected/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-kube-api-access-bj2dm\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.001880 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.001978 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002017 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002088 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: 
\"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002139 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002177 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002222 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002264 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002307 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.003279 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.003692 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj2dm\" (UniqueName: \"kubernetes.io/projected/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-kube-api-access-bj2dm\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.003828 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.003867 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " 
pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.003386 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.002774 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-dir\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.007880 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.008517 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-audit-policies\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.009833 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.009888 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.010342 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-login\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.010653 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-error\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.011399 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 
07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.011640 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.013534 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.014873 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.015991 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-v4-0-config-system-session\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.041509 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj2dm\" (UniqueName: 
\"kubernetes.io/projected/b8f1d1be-2fad-4cfa-9b67-1b91879bf70d-kube-api-access-bj2dm\") pod \"oauth-openshift-5d4f55d7c5-kcwqp\" (UID: \"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d\") " pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.120375 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.549176 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp"] Dec 07 19:19:10 crc kubenswrapper[4815]: W1207 19:19:10.556047 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8f1d1be_2fad_4cfa_9b67_1b91879bf70d.slice/crio-541f3cfeecbb64c7c151960ec91f686cdecaef5465d1535f45453cfebd2665bc WatchSource:0}: Error finding container 541f3cfeecbb64c7c151960ec91f686cdecaef5465d1535f45453cfebd2665bc: Status 404 returned error can't find the container with id 541f3cfeecbb64c7c151960ec91f686cdecaef5465d1535f45453cfebd2665bc Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.719164 4815 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720014 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c" gracePeriod=15 Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720068 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" 
containerID="cri-o://4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029" gracePeriod=15 Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720020 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3" gracePeriod=15 Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720178 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5" gracePeriod=15 Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720282 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2" gracePeriod=15 Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720630 4815 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720888 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720905 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720927 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720935 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720947 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720954 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720964 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720969 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720979 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.720986 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.720996 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721002 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721092 4815 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721101 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721115 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721127 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721133 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721140 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.721228 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.721234 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.724806 4815 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.731707 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.738088 4815 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816690 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816781 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816803 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816829 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 
07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816865 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816895 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816932 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.816948 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: E1207 19:19:10.858554 4815 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.2:6443: connect: connection refused" 
event="&Event{ObjectMeta:{oauth-openshift-5d4f55d7c5-kcwqp.187f05114f6eb70c openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-5d4f55d7c5-kcwqp,UID:b8f1d1be-2fad-4cfa-9b67-1b91879bf70d,APIVersion:v1,ResourceVersion:29366,FieldPath:spec.containers{oauth-openshift},},Reason:Created,Message:Created container oauth-openshift,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-07 19:19:10.857762572 +0000 UTC m=+255.436752617,LastTimestamp:2025-12-07 19:19:10.857762572 +0000 UTC m=+255.436752617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918553 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918596 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918623 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918651 4815 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918675 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918709 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918733 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918757 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918825 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918865 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918891 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918960 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.918998 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.919026 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.919051 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:10 crc kubenswrapper[4815]: I1207 19:19:10.919078 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.537859 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.541223 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.542678 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3" exitCode=0 Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.542714 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5" exitCode=0 Dec 07 
19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.542731 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029" exitCode=0 Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.542742 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2" exitCode=2 Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.542828 4815 scope.go:117] "RemoveContainer" containerID="539bb1f0e569004272aac994e13bdc7f17f342f5986f08beca97a4c888e046c2" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.547855 4815 generic.go:334] "Generic (PLEG): container finished" podID="8f52b789-4509-4834-95f0-62d38f4e620c" containerID="02391ea487d6d52c854b6c17e4271168ad5518e6301ed59d9b0448f19c9d42e1" exitCode=0 Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.547953 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8f52b789-4509-4834-95f0-62d38f4e620c","Type":"ContainerDied","Data":"02391ea487d6d52c854b6c17e4271168ad5518e6301ed59d9b0448f19c9d42e1"} Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.548714 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.563742 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/0.log" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.563832 4815 generic.go:334] "Generic (PLEG): container finished" 
podID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" containerID="4673c0b8e98df8030e13b01ce760165105e85dff292240f5862fdff68650358c" exitCode=255 Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.564001 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" event={"ID":"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d","Type":"ContainerDied","Data":"4673c0b8e98df8030e13b01ce760165105e85dff292240f5862fdff68650358c"} Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.564089 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" event={"ID":"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d","Type":"ContainerStarted","Data":"541f3cfeecbb64c7c151960ec91f686cdecaef5465d1535f45453cfebd2665bc"} Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.564964 4815 scope.go:117] "RemoveContainer" containerID="4673c0b8e98df8030e13b01ce760165105e85dff292240f5862fdff68650358c" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.565768 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:11 crc kubenswrapper[4815]: I1207 19:19:11.566247 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.089657 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:12Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1610512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5040a128a7439f631f71b3c1ba8c11857cedda9212053fa32fa210068429f665\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:de3d8ff2eb7431b14b78e2ccba3b498e75aa9cd45a4831e10a4ac8a4539ed765\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"
sizeBytes\\\":1222233431},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9b9ae7ab67ec3e86ec42d5973457d724f7903ed8fc79c6be7fcb796a663d02cf\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:af4bf3356fe0006344ab13e9c13322b5c1f5335da8d07e947e69e71ea7ab655c\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201863506},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.090160 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.090387 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.090571 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.090756 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.090772 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.575737 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/1.log" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.576727 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/0.log" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.576771 4815 generic.go:334] "Generic (PLEG): container finished" podID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" exitCode=255 Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.576861 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" event={"ID":"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d","Type":"ContainerDied","Data":"4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e"} Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.576902 4815 scope.go:117] "RemoveContainer" containerID="4673c0b8e98df8030e13b01ce760165105e85dff292240f5862fdff68650358c" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.577597 4815 scope.go:117] "RemoveContainer" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.577634 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.577823 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: E1207 19:19:12.578070 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.586025 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.988264 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.989165 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:12 crc kubenswrapper[4815]: I1207 19:19:12.989580 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.103951 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.105220 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.106302 4815 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.107012 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.107491 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.151731 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock\") pod \"8f52b789-4509-4834-95f0-62d38f4e620c\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.151801 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access\") pod \"8f52b789-4509-4834-95f0-62d38f4e620c\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.151819 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir\") pod \"8f52b789-4509-4834-95f0-62d38f4e620c\" (UID: \"8f52b789-4509-4834-95f0-62d38f4e620c\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.152047 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock" (OuterVolumeSpecName: "var-lock") pod "8f52b789-4509-4834-95f0-62d38f4e620c" (UID: "8f52b789-4509-4834-95f0-62d38f4e620c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.152095 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8f52b789-4509-4834-95f0-62d38f4e620c" (UID: "8f52b789-4509-4834-95f0-62d38f4e620c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.152249 4815 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-var-lock\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.152268 4815 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f52b789-4509-4834-95f0-62d38f4e620c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.160679 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8f52b789-4509-4834-95f0-62d38f4e620c" (UID: "8f52b789-4509-4834-95f0-62d38f4e620c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.253904 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254254 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254301 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254346 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254391 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254549 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254760 4815 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254801 4815 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254824 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8f52b789-4509-4834-95f0-62d38f4e620c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.254845 4815 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.601229 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.601227 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8f52b789-4509-4834-95f0-62d38f4e620c","Type":"ContainerDied","Data":"1393b1547e7b18d52d852c2b98cb047a3bbf6001736643a2f8d414666b404ec2"} Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.601468 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1393b1547e7b18d52d852c2b98cb047a3bbf6001736643a2f8d414666b404ec2" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.605230 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/1.log" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.606131 4815 scope.go:117] "RemoveContainer" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.606767 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.606964 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.607839 4815 status_manager.go:851] "Failed to get status for 
pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.608471 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.610785 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.612134 4815 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c" exitCode=0 Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.612218 4815 scope.go:117] "RemoveContainer" containerID="c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.612470 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.633064 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.633641 4815 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.634201 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.638811 4815 scope.go:117] "RemoveContainer" containerID="902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.652625 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.655170 4815 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.655665 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.667529 4815 scope.go:117] "RemoveContainer" containerID="4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.695262 4815 scope.go:117] "RemoveContainer" containerID="5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.718391 4815 scope.go:117] "RemoveContainer" containerID="52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.744805 4815 scope.go:117] "RemoveContainer" containerID="e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.779948 4815 scope.go:117] "RemoveContainer" containerID="c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.782094 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.782458 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\": container with ID starting with c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3 not found: ID does not exist" containerID="c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.782509 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3"} err="failed to get container status \"c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\": rpc error: code = NotFound desc = could not find container \"c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3\": container with ID starting with c76d9f056d22d1be012327374370269c2f6d6b8bd341c98419fcb744370751e3 not found: ID does not exist" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.782530 4815 scope.go:117] "RemoveContainer" containerID="902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.782830 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\": container with ID starting with 902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5 not found: ID does not exist" containerID="902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.782853 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5"} err="failed to get container status \"902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\": rpc error: code = NotFound desc = could not find container \"902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5\": container with ID 
starting with 902a764a3f6e3ac55c1817f369dfd69ffbf8528bfe4444a6514cdabd48f09fb5 not found: ID does not exist" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.782880 4815 scope.go:117] "RemoveContainer" containerID="4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.784234 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\": container with ID starting with 4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029 not found: ID does not exist" containerID="4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.784287 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029"} err="failed to get container status \"4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\": rpc error: code = NotFound desc = could not find container \"4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029\": container with ID starting with 4f59886c065844948639bf9828ed20c24546cc38659683c98162568c6b12c029 not found: ID does not exist" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.784303 4815 scope.go:117] "RemoveContainer" containerID="5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.784561 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\": container with ID starting with 5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2 not found: ID does not exist" containerID="5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2" Dec 07 
19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.784577 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2"} err="failed to get container status \"5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\": rpc error: code = NotFound desc = could not find container \"5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2\": container with ID starting with 5bec885c9a80d9b5b1fe630a9b7a3f19b23b0af05f359bd87a664f007812b4c2 not found: ID does not exist" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.784608 4815 scope.go:117] "RemoveContainer" containerID="52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.784798 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\": container with ID starting with 52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c not found: ID does not exist" containerID="52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.785213 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c"} err="failed to get container status \"52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\": rpc error: code = NotFound desc = could not find container \"52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c\": container with ID starting with 52ab545f838534306ea70cc6e400e9fb56ed71ed8206d72c6626166f3a51a93c not found: ID does not exist" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.785250 4815 scope.go:117] "RemoveContainer" 
containerID="e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b" Dec 07 19:19:13 crc kubenswrapper[4815]: E1207 19:19:13.785460 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\": container with ID starting with e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b not found: ID does not exist" containerID="e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b" Dec 07 19:19:13 crc kubenswrapper[4815]: I1207 19:19:13.785497 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b"} err="failed to get container status \"e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\": rpc error: code = NotFound desc = could not find container \"e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b\": container with ID starting with e4460a4577c38b0cefbd1b83c55ab11d12d5f32ce78cda8ec02629e73bd4966b not found: ID does not exist" Dec 07 19:19:15 crc kubenswrapper[4815]: E1207 19:19:15.633628 4815 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/events\": dial tcp 38.102.83.2:6443: connect: connection refused" event="&Event{ObjectMeta:{oauth-openshift-5d4f55d7c5-kcwqp.187f05114f6eb70c openshift-authentication 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-authentication,Name:oauth-openshift-5d4f55d7c5-kcwqp,UID:b8f1d1be-2fad-4cfa-9b67-1b91879bf70d,APIVersion:v1,ResourceVersion:29366,FieldPath:spec.containers{oauth-openshift},},Reason:Created,Message:Created container oauth-openshift,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-07 19:19:10.857762572 +0000 UTC 
m=+255.436752617,LastTimestamp:2025-12-07 19:19:10.857762572 +0000 UTC m=+255.436752617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 07 19:19:15 crc kubenswrapper[4815]: E1207 19:19:15.776651 4815 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:15 crc kubenswrapper[4815]: I1207 19:19:15.777686 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:15 crc kubenswrapper[4815]: I1207 19:19:15.778791 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:15 crc kubenswrapper[4815]: I1207 19:19:15.779747 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:15 crc kubenswrapper[4815]: W1207 19:19:15.807147 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5a53af493b7f5500461da8be845ba6209243a2ebfeeb05b21095ad27cf2d1a8a WatchSource:0}: Error finding container 5a53af493b7f5500461da8be845ba6209243a2ebfeeb05b21095ad27cf2d1a8a: Status 404 
returned error can't find the container with id 5a53af493b7f5500461da8be845ba6209243a2ebfeeb05b21095ad27cf2d1a8a Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.626499 4815 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.627377 4815 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.627755 4815 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.630254 4815 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.632236 4815 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: I1207 19:19:16.632303 4815 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.632762 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="200ms" Dec 07 19:19:16 crc kubenswrapper[4815]: I1207 19:19:16.637202 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c"} Dec 07 19:19:16 crc kubenswrapper[4815]: I1207 19:19:16.638467 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5a53af493b7f5500461da8be845ba6209243a2ebfeeb05b21095ad27cf2d1a8a"} Dec 07 19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.639615 4815 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:16 crc kubenswrapper[4815]: I1207 19:19:16.639839 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:16 crc kubenswrapper[4815]: I1207 19:19:16.640415 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 
19:19:16 crc kubenswrapper[4815]: E1207 19:19:16.833556 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="400ms" Dec 07 19:19:17 crc kubenswrapper[4815]: E1207 19:19:17.234375 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="800ms" Dec 07 19:19:17 crc kubenswrapper[4815]: E1207 19:19:17.644424 4815 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 07 19:19:18 crc kubenswrapper[4815]: E1207 19:19:18.036031 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="1.6s" Dec 07 19:19:19 crc kubenswrapper[4815]: E1207 19:19:19.637548 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="3.2s" Dec 07 19:19:20 crc kubenswrapper[4815]: I1207 19:19:20.121765 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:20 crc kubenswrapper[4815]: I1207 19:19:20.121852 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:20 crc kubenswrapper[4815]: I1207 19:19:20.122758 4815 scope.go:117] "RemoveContainer" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" Dec 07 19:19:20 crc kubenswrapper[4815]: E1207 19:19:20.123315 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.471603 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-07T19:19:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:15adb3b2133604b064893f8009a74145e4c8bb5b134d111346dcccbdd2aa9bc2\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:164fc35a19aa6cc886c8015c8ee3eba4895e76b1152cb9d795e4f3154a8533a3\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":161
0512706},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:5040a128a7439f631f71b3c1ba8c11857cedda9212053fa32fa210068429f665\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:de3d8ff2eb7431b14b78e2ccba3b498e75aa9cd45a4831e10a4ac8a4539ed765\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1222233431},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:9b9ae7ab67ec3e86ec42d5973457d724f7903ed8fc79c6be7fcb796a663d02cf\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:af4bf3356fe0006344ab13e9c13322b5c1f5335da8d07e947e69e71ea7ab655c\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201863506},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1b1026c62413fa239fa4ff6541fe8bda656c1281867ad6ee2c848feccb13c97e\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:2b633ebdc901d19290af4dc2d09e2b59c504c0fc15a3fba410b0ce098e2d5753\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1141987142},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":10
32059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0
b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\
\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.472256 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.472586 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.472814 4815 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.473060 4815 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.473102 4815 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.770106 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.771293 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.772022 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.799800 4815 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.799855 4815 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.800560 4815 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:22 crc kubenswrapper[4815]: I1207 19:19:22.801420 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.825574 4815 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" volumeName="registry-storage" Dec 07 19:19:22 crc kubenswrapper[4815]: E1207 19:19:22.838611 4815 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.2:6443: connect: connection refused" interval="6.4s" Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.682158 4815 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a9a35e3567ce30682722173f11c9e387eed04c4a5f16f1b7d43a912fe3a9bc40" exitCode=0 Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.682290 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a9a35e3567ce30682722173f11c9e387eed04c4a5f16f1b7d43a912fe3a9bc40"} Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.682599 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3993d4a6be68c6953eaf26753fe49b4b61f69784109f4a39d71a86104db3f720"} Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.682982 4815 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.683036 4815 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:23 crc kubenswrapper[4815]: E1207 19:19:23.683476 4815 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.2:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.683521 4815 status_manager.go:851] "Failed to get status for pod" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-5d4f55d7c5-kcwqp\": dial tcp 38.102.83.2:6443: connect: connection refused" Dec 07 19:19:23 crc kubenswrapper[4815]: I1207 19:19:23.683770 4815 status_manager.go:851] "Failed to get status for pod" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.2:6443: connect: 
connection refused" Dec 07 19:19:24 crc kubenswrapper[4815]: I1207 19:19:24.690174 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e744c1f8e30c220e9e37e875d1514681780a0f26779e6439ecfa0bc011885f6f"} Dec 07 19:19:24 crc kubenswrapper[4815]: I1207 19:19:24.690213 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e4d75bc9616165437e0e1489a472014c84191ea0da284e042a8fd13d4f67deca"} Dec 07 19:19:24 crc kubenswrapper[4815]: I1207 19:19:24.690222 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67fc2cb0cc80b427d8a00e3c007eef7406dee5d0dd09134b5773e2b3a51ff137"} Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.697095 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.697350 4815 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0" exitCode=1 Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.697399 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0"} Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.697872 4815 scope.go:117] "RemoveContainer" containerID="8f92a2d37af34c519a6b00a9643f4a46de779ae0070adb0826e9f227960fe4a0" Dec 07 19:19:25 
crc kubenswrapper[4815]: I1207 19:19:25.717407 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"45633faabf45122f2ba74aa99e4c6f1aa14f0b5ab995e26fabbc4e15e6a4e9f6"} Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.717564 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9e9ba0fa7135c81873d95937512697e38b5c2e2e447d7e507b28042fb8e0683b"} Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.717930 4815 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.717949 4815 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:25 crc kubenswrapper[4815]: I1207 19:19:25.718113 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:26 crc kubenswrapper[4815]: I1207 19:19:26.730048 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 07 19:19:26 crc kubenswrapper[4815]: I1207 19:19:26.730109 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9ec3c453bd3a52822f4347d9c9d9e25492fb9c4c944188391295214c73b6ffa9"} Dec 07 19:19:27 crc kubenswrapper[4815]: I1207 19:19:27.801751 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:27 
crc kubenswrapper[4815]: I1207 19:19:27.802236 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:27 crc kubenswrapper[4815]: I1207 19:19:27.811024 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:30 crc kubenswrapper[4815]: I1207 19:19:30.727722 4815 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:30 crc kubenswrapper[4815]: I1207 19:19:30.758035 4815 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:30 crc kubenswrapper[4815]: I1207 19:19:30.758062 4815 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:30 crc kubenswrapper[4815]: I1207 19:19:30.763095 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:30 crc kubenswrapper[4815]: I1207 19:19:30.780840 4815 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="49afaf0a-48d4-4ab1-b026-52235c0df3a7" Dec 07 19:19:31 crc kubenswrapper[4815]: I1207 19:19:31.763285 4815 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:31 crc kubenswrapper[4815]: I1207 19:19:31.763637 4815 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="fe01e9d3-9839-41ea-82d9-f4b6f48bf3b2" Dec 07 19:19:31 crc kubenswrapper[4815]: I1207 19:19:31.765998 4815 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="49afaf0a-48d4-4ab1-b026-52235c0df3a7" Dec 07 19:19:33 crc kubenswrapper[4815]: I1207 19:19:33.899129 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:19:34 crc kubenswrapper[4815]: I1207 19:19:34.770135 4815 scope.go:117] "RemoveContainer" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.549402 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.553046 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.789262 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/2.log" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.790055 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/1.log" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.790098 4815 generic.go:334] "Generic (PLEG): container finished" podID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" containerID="7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3" exitCode=255 Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.790232 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" 
event={"ID":"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d","Type":"ContainerDied","Data":"7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3"} Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.790301 4815 scope.go:117] "RemoveContainer" containerID="4df41cf0c3043b4f25e4e5545f1accc93408e2a493dbf240f4917cc9faf8448e" Dec 07 19:19:35 crc kubenswrapper[4815]: I1207 19:19:35.790655 4815 scope.go:117] "RemoveContainer" containerID="7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3" Dec 07 19:19:35 crc kubenswrapper[4815]: E1207 19:19:35.790850 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" Dec 07 19:19:36 crc kubenswrapper[4815]: I1207 19:19:36.797634 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/2.log" Dec 07 19:19:40 crc kubenswrapper[4815]: I1207 19:19:40.121603 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:40 crc kubenswrapper[4815]: I1207 19:19:40.121886 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" Dec 07 19:19:40 crc kubenswrapper[4815]: I1207 19:19:40.122394 4815 scope.go:117] "RemoveContainer" containerID="7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3" Dec 07 19:19:40 crc kubenswrapper[4815]: E1207 19:19:40.122567 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d" Dec 07 19:19:40 crc kubenswrapper[4815]: I1207 19:19:40.903011 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 07 19:19:41 crc kubenswrapper[4815]: I1207 19:19:41.176534 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 07 19:19:41 crc kubenswrapper[4815]: I1207 19:19:41.248998 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 07 19:19:41 crc kubenswrapper[4815]: I1207 19:19:41.355003 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.113570 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.162547 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.220100 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.232760 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.509730 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 07 19:19:42 crc 
kubenswrapper[4815]: I1207 19:19:42.572945 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.776992 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 07 19:19:42 crc kubenswrapper[4815]: I1207 19:19:42.962903 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.566394 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.601790 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.603512 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.704153 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.738755 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.843290 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.862283 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.903135 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.912591 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 07 19:19:43 crc kubenswrapper[4815]: I1207 19:19:43.936901 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.168820 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.353277 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.366927 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.416936 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.494113 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.572551 4815 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.576946 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.577000 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.582966 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.597767 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.597752843 podStartE2EDuration="14.597752843s" podCreationTimestamp="2025-12-07 19:19:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:19:44.593696607 +0000 UTC m=+289.172686652" watchObservedRunningTime="2025-12-07 19:19:44.597752843 +0000 UTC m=+289.176742888" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.605519 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.642635 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.748615 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.750316 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.919245 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 07 19:19:44 crc kubenswrapper[4815]: I1207 19:19:44.942268 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.037452 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 07 
19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.075238 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.092512 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.164817 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.306605 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.337693 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.375358 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.396902 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.402119 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.451048 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.462715 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.517059 4815 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.536900 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.707779 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.733133 4815 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.848606 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.869552 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 07 19:19:45 crc kubenswrapper[4815]: I1207 19:19:45.922334 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.009497 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.021235 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.110827 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.110873 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 07 19:19:46 crc kubenswrapper[4815]: 
I1207 19:19:46.166974 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.238851 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.300141 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.338708 4815 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.397507 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.618261 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.660020 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.761246 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.803395 4815 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.906793 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.937001 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 07 19:19:46 crc kubenswrapper[4815]: I1207 19:19:46.965818 4815 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.199205 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.249394 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.318055 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.394661 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.451662 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.496625 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.552719 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.555646 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.599949 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.620404 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 07 19:19:47 crc 
kubenswrapper[4815]: I1207 19:19:47.631902 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.796610 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.801854 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.824332 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.881508 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.963593 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 07 19:19:47 crc kubenswrapper[4815]: I1207 19:19:47.977250 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.224107 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.249322 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.315909 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.322809 4815 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.340273 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.388358 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.394976 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.441106 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.505019 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.517318 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.525868 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.701005 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.702381 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.723463 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.725649 4815 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.837013 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.874820 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.894594 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 07 19:19:48 crc kubenswrapper[4815]: I1207 19:19:48.903844 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.088824 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.138116 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.206873 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.257786 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.275741 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.366419 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.406067 4815 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.413269 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.508853 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.565682 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.578984 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.613737 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.637454 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.744691 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.820520 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.842501 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 07 19:19:49 crc kubenswrapper[4815]: I1207 19:19:49.962372 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 07 
19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.024037 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.030511 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.043276 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.119313 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.151512 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.324328 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.335417 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.355150 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.438362 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.464213 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.545831 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.592771 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.597336 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.613591 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.617723 4815 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.655187 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.691655 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.715523 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.715790 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.900839 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.909825 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.920714 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 07 19:19:50 crc kubenswrapper[4815]: I1207 19:19:50.975065 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.063141 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.208181 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.289502 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.300194 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.310658 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.326556 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.457316 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.532592 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.732136 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.817450 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 07 19:19:51 crc kubenswrapper[4815]: I1207 19:19:51.904002 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.011654 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.143331 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.215552 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.315658 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.323571 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.360555 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.388366 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.388617 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.388761 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.486957 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.508672 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.533988 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.541358 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.545751 4815 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.582040 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.589781 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.630978 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.654089 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.726197 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.757903 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.811565 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 07 19:19:52 crc kubenswrapper[4815]: I1207 19:19:52.920473 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.006298 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.060850 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.083887 4815 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.084130 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c" gracePeriod=5
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.105143 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.136564 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.166145 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.199238 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.206850 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.266209 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.495383 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.504822 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.520291 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.576338 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.743984 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.769016 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.798006 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.906491 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.906554 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.931868 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.938435 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 07 19:19:53 crc kubenswrapper[4815]: I1207 19:19:53.983820 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.049381 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.052229 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.330715 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.359988 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.386757 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.394714 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.511169 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.570639 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.576572 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.621252 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.629361 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.666757 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.770997 4815 scope.go:117] "RemoveContainer" containerID="7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3"
Dec 07 19:19:54 crc kubenswrapper[4815]: E1207 19:19:54.771325 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 20s restarting failed container=oauth-openshift pod=oauth-openshift-5d4f55d7c5-kcwqp_openshift-authentication(b8f1d1be-2fad-4cfa-9b67-1b91879bf70d)\"" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podUID="b8f1d1be-2fad-4cfa-9b67-1b91879bf70d"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.785981 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 07 19:19:54 crc kubenswrapper[4815]: I1207 19:19:54.823649 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.045889 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.090285 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.125305 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.148224 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.152032 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.191197 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.193695 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.270216 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.274578 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.323735 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.453868 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.570819 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.571859 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.623778 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.638962 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.639653 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.688578 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.731435 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.772287 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.830286 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.867845 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.914449 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.923412 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 07 19:19:55 crc kubenswrapper[4815]: I1207 19:19:55.948248 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.019512 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.060713 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.138010 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.168246 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.349087 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.662908 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.761620 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 07 19:19:56 crc kubenswrapper[4815]: I1207 19:19:56.968683 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.157911 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.304832 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.336274 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.371608 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.387500 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Dec 07 19:19:57 crc kubenswrapper[4815]: I1207 19:19:57.792239 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.110475 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.671440 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.671546 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821734 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821772 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821794 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821834 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821841 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821856 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821868 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821895 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.821898 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.822121 4815 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.822133 4815 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.822142 4815 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.822150 4815 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.829493 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.922806 4815 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.925171 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.925652 4815 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c" exitCode=137
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.925785 4815 scope.go:117] "RemoveContainer" containerID="14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.926094 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.941655 4815 scope.go:117] "RemoveContainer" containerID="14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c"
Dec 07 19:19:58 crc kubenswrapper[4815]: E1207 19:19:58.942102 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c\": container with ID starting with 14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c not found: ID does not exist" containerID="14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c"
Dec 07 19:19:58 crc kubenswrapper[4815]: I1207 19:19:58.942141 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c"} err="failed to get container status \"14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c\": rpc error: code = NotFound desc = could not find container \"14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c\": container with ID starting with 14345a5e5f1047bc8de1a8d5458a23eb2bcf7b63ea43575787421aaf6fbe964c not found: ID does not exist"
Dec 07 19:19:59 crc kubenswrapper[4815]: I1207 19:19:59.348249 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 07 19:19:59 crc kubenswrapper[4815]: I1207 19:19:59.400535 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 07 19:19:59 crc kubenswrapper[4815]: I1207 19:19:59.777503 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Dec 07 19:19:59 crc kubenswrapper[4815]: I1207 19:19:59.921512 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Dec 07 19:20:06 crc kubenswrapper[4815]: I1207 19:20:06.770693 4815 scope.go:117] "RemoveContainer" containerID="7979a219f32e609d89a2fb8bf24bae0d9382fc93aba2d43438b12a47b40a48f3"
Dec 07 19:20:07 crc kubenswrapper[4815]: I1207 19:20:07.988706 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-5d4f55d7c5-kcwqp_b8f1d1be-2fad-4cfa-9b67-1b91879bf70d/oauth-openshift/2.log"
Dec 07 19:20:07 crc kubenswrapper[4815]: I1207 19:20:07.989046 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" event={"ID":"b8f1d1be-2fad-4cfa-9b67-1b91879bf70d","Type":"ContainerStarted","Data":"1852857dff7f91d9fd1e8ea8f61e4f3ea505990b99dcbc9eb838b9e4d0298955"}
Dec 07 19:20:07 crc kubenswrapper[4815]: I1207 19:20:07.990196 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp"
Dec 07 19:20:07 crc kubenswrapper[4815]: I1207 19:20:07.996026 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp"
Dec 07 19:20:08 crc kubenswrapper[4815]: I1207 19:20:08.015325 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d4f55d7c5-kcwqp" podStartSLOduration=88.015307291 podStartE2EDuration="1m28.015307291s" podCreationTimestamp="2025-12-07 19:18:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:20:08.01397183 +0000 UTC m=+312.592961885" watchObservedRunningTime="2025-12-07 19:20:08.015307291 +0000 UTC m=+312.594297336"
Dec 07 19:20:13 crc kubenswrapper[4815]: I1207 19:20:13.015520 4815 generic.go:334] "Generic (PLEG): container finished" podID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerID="254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad" exitCode=0
Dec 07 19:20:13 crc kubenswrapper[4815]: I1207 19:20:13.015624 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerDied","Data":"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad"}
Dec 07 19:20:13 crc kubenswrapper[4815]: I1207 19:20:13.016410 4815 scope.go:117] "RemoveContainer" containerID="254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad"
Dec 07 19:20:13 crc kubenswrapper[4815]: I1207 19:20:13.356002 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v"
Dec 07 19:20:13 crc kubenswrapper[4815]: I1207 19:20:13.356359 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v"
Dec 07 19:20:14 crc kubenswrapper[4815]: I1207 19:20:14.024871 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerStarted","Data":"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c"}
Dec 07 19:20:14 crc kubenswrapper[4815]: I1207 19:20:14.025341 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v"
Dec 07 19:20:14 crc kubenswrapper[4815]: I1207 19:20:14.029873 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v"
Dec 07 19:20:26 crc kubenswrapper[4815]: I1207 19:20:26.701335 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"]
Dec 07 19:20:26 crc kubenswrapper[4815]: I1207 19:20:26.703279 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" podUID="df54df8a-669c-4230-b377-640a79b757ab" containerName="controller-manager" containerID="cri-o://cf7a7eeb0161b77fa9cbe45b5ed688b4fe92d870fc90ef417650db5b3c9887b2" gracePeriod=30
Dec 07 19:20:26 crc kubenswrapper[4815]: I1207 19:20:26.793036 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"]
Dec 07 19:20:26 crc kubenswrapper[4815]: I1207 19:20:26.793495 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" podUID="9f6cc294-684f-4ac2-8eb1-183af364c619" containerName="route-controller-manager" containerID="cri-o://503e958d61e1dfcaca0ec3d4a2ae6fa25c4b0e6db4e19faac84c19fc7c1d4225" gracePeriod=30
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.090619 4815 generic.go:334] "Generic (PLEG): container finished" podID="df54df8a-669c-4230-b377-640a79b757ab" containerID="cf7a7eeb0161b77fa9cbe45b5ed688b4fe92d870fc90ef417650db5b3c9887b2" exitCode=0
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.090711 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" event={"ID":"df54df8a-669c-4230-b377-640a79b757ab","Type":"ContainerDied","Data":"cf7a7eeb0161b77fa9cbe45b5ed688b4fe92d870fc90ef417650db5b3c9887b2"}
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.092230 4815 generic.go:334] "Generic (PLEG): container finished" podID="9f6cc294-684f-4ac2-8eb1-183af364c619" containerID="503e958d61e1dfcaca0ec3d4a2ae6fa25c4b0e6db4e19faac84c19fc7c1d4225" exitCode=0
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.092260 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" event={"ID":"9f6cc294-684f-4ac2-8eb1-183af364c619","Type":"ContainerDied","Data":"503e958d61e1dfcaca0ec3d4a2ae6fa25c4b0e6db4e19faac84c19fc7c1d4225"}
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.128022 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j"
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.262849 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.307712 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config\") pod \"df54df8a-669c-4230-b377-640a79b757ab\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") "
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.307793 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca\") pod \"df54df8a-669c-4230-b377-640a79b757ab\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") "
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.307836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles\") pod \"df54df8a-669c-4230-b377-640a79b757ab\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") "
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.308431 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p4zx\" (UniqueName: \"kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx\") pod 
\"df54df8a-669c-4230-b377-640a79b757ab\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.308529 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert\") pod \"df54df8a-669c-4230-b377-640a79b757ab\" (UID: \"df54df8a-669c-4230-b377-640a79b757ab\") " Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.309937 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config" (OuterVolumeSpecName: "config") pod "df54df8a-669c-4230-b377-640a79b757ab" (UID: "df54df8a-669c-4230-b377-640a79b757ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.310804 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "df54df8a-669c-4230-b377-640a79b757ab" (UID: "df54df8a-669c-4230-b377-640a79b757ab"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.311007 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca" (OuterVolumeSpecName: "client-ca") pod "df54df8a-669c-4230-b377-640a79b757ab" (UID: "df54df8a-669c-4230-b377-640a79b757ab"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.314952 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "df54df8a-669c-4230-b377-640a79b757ab" (UID: "df54df8a-669c-4230-b377-640a79b757ab"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.318782 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx" (OuterVolumeSpecName: "kube-api-access-9p4zx") pod "df54df8a-669c-4230-b377-640a79b757ab" (UID: "df54df8a-669c-4230-b377-640a79b757ab"). InnerVolumeSpecName "kube-api-access-9p4zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410162 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjtrc\" (UniqueName: \"kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc\") pod \"9f6cc294-684f-4ac2-8eb1-183af364c619\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410251 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca\") pod \"9f6cc294-684f-4ac2-8eb1-183af364c619\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410316 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config\") pod \"9f6cc294-684f-4ac2-8eb1-183af364c619\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " 
Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410412 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert\") pod \"9f6cc294-684f-4ac2-8eb1-183af364c619\" (UID: \"9f6cc294-684f-4ac2-8eb1-183af364c619\") " Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410633 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410659 4815 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-client-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410671 4815 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/df54df8a-669c-4230-b377-640a79b757ab-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410686 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p4zx\" (UniqueName: \"kubernetes.io/projected/df54df8a-669c-4230-b377-640a79b757ab-kube-api-access-9p4zx\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.410698 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df54df8a-669c-4230-b377-640a79b757ab-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.411347 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f6cc294-684f-4ac2-8eb1-183af364c619" (UID: 
"9f6cc294-684f-4ac2-8eb1-183af364c619"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.411356 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config" (OuterVolumeSpecName: "config") pod "9f6cc294-684f-4ac2-8eb1-183af364c619" (UID: "9f6cc294-684f-4ac2-8eb1-183af364c619"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.413950 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f6cc294-684f-4ac2-8eb1-183af364c619" (UID: "9f6cc294-684f-4ac2-8eb1-183af364c619"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.414140 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc" (OuterVolumeSpecName: "kube-api-access-pjtrc") pod "9f6cc294-684f-4ac2-8eb1-183af364c619" (UID: "9f6cc294-684f-4ac2-8eb1-183af364c619"). InnerVolumeSpecName "kube-api-access-pjtrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.512064 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f6cc294-684f-4ac2-8eb1-183af364c619-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.512105 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjtrc\" (UniqueName: \"kubernetes.io/projected/9f6cc294-684f-4ac2-8eb1-183af364c619-kube-api-access-pjtrc\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.512147 4815 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-client-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:27 crc kubenswrapper[4815]: I1207 19:20:27.512161 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f6cc294-684f-4ac2-8eb1-183af364c619-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.102838 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" event={"ID":"9f6cc294-684f-4ac2-8eb1-183af364c619","Type":"ContainerDied","Data":"f2e64ceb5a41b2db28d78f7abd922bde732d98001eab561a7aecde4718a39fe9"} Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.102905 4815 scope.go:117] "RemoveContainer" containerID="503e958d61e1dfcaca0ec3d4a2ae6fa25c4b0e6db4e19faac84c19fc7c1d4225" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.103757 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.108462 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" event={"ID":"df54df8a-669c-4230-b377-640a79b757ab","Type":"ContainerDied","Data":"4e31e68bffd3deabc48fc1b3d681fac041f1704c95869de04ce2e847c138532e"} Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.108544 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6hr2j" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.136944 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.142989 4815 scope.go:117] "RemoveContainer" containerID="cf7a7eeb0161b77fa9cbe45b5ed688b4fe92d870fc90ef417650db5b3c9887b2" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.145664 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mh7gn"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.152860 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.160018 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6hr2j"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.814652 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp"] Dec 07 19:20:28 crc kubenswrapper[4815]: E1207 19:20:28.814935 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.814950 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 07 19:20:28 crc kubenswrapper[4815]: E1207 19:20:28.814962 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df54df8a-669c-4230-b377-640a79b757ab" containerName="controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.814971 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="df54df8a-669c-4230-b377-640a79b757ab" containerName="controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: E1207 19:20:28.814984 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6cc294-684f-4ac2-8eb1-183af364c619" containerName="route-controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.814991 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6cc294-684f-4ac2-8eb1-183af364c619" containerName="route-controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: E1207 19:20:28.815005 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" containerName="installer" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815013 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" containerName="installer" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815120 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815137 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f52b789-4509-4834-95f0-62d38f4e620c" containerName="installer" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815148 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6cc294-684f-4ac2-8eb1-183af364c619" 
containerName="route-controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815157 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="df54df8a-669c-4230-b377-640a79b757ab" containerName="controller-manager" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.815611 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.819847 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.819939 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.820849 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.821136 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.821669 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.821981 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.826000 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.826733 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.828254 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.831562 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.835567 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.836554 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.838233 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.838450 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.845546 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.845676 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.854827 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931093 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-client-ca\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931232 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xnk\" (UniqueName: \"kubernetes.io/projected/ed9e1527-b614-4f46-bdb1-787c5db70e7b-kube-api-access-q5xnk\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931311 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931393 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9e1527-b614-4f46-bdb1-787c5db70e7b-serving-cert\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931427 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " 
pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931468 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-config\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931510 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931540 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:28 crc kubenswrapper[4815]: I1207 19:20:28.931575 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwx8\" (UniqueName: \"kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.032981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-config\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033037 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033063 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033088 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwx8\" (UniqueName: \"kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033139 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-client-ca\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 
19:20:29.033176 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xnk\" (UniqueName: \"kubernetes.io/projected/ed9e1527-b614-4f46-bdb1-787c5db70e7b-kube-api-access-q5xnk\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033200 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033231 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed9e1527-b614-4f46-bdb1-787c5db70e7b-serving-cert\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.033253 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.035118 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: 
\"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.035174 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-client-ca\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.035312 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed9e1527-b614-4f46-bdb1-787c5db70e7b-config\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.036116 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.036640 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.039771 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ed9e1527-b614-4f46-bdb1-787c5db70e7b-serving-cert\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.046322 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.054499 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwx8\" (UniqueName: \"kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8\") pod \"controller-manager-6799c5f44c-lhp92\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.055683 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xnk\" (UniqueName: \"kubernetes.io/projected/ed9e1527-b614-4f46-bdb1-787c5db70e7b-kube-api-access-q5xnk\") pod \"route-controller-manager-c4c7d9b57-b8jkp\" (UID: \"ed9e1527-b614-4f46-bdb1-787c5db70e7b\") " pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.146687 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.160348 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.391637 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp"] Dec 07 19:20:29 crc kubenswrapper[4815]: W1207 19:20:29.397568 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9e1527_b614_4f46_bdb1_787c5db70e7b.slice/crio-a4559a70be43660d267afaf8170d6452214ce6583be91db906d36ad277450041 WatchSource:0}: Error finding container a4559a70be43660d267afaf8170d6452214ce6583be91db906d36ad277450041: Status 404 returned error can't find the container with id a4559a70be43660d267afaf8170d6452214ce6583be91db906d36ad277450041 Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.436467 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:20:29 crc kubenswrapper[4815]: W1207 19:20:29.439128 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d21150d_3baa_49af_ac9d_3eb0ab484b18.slice/crio-d3e0640fe5cd661cf5069297b282171f185f1d71dcf48593282f2ac7a999c682 WatchSource:0}: Error finding container d3e0640fe5cd661cf5069297b282171f185f1d71dcf48593282f2ac7a999c682: Status 404 returned error can't find the container with id d3e0640fe5cd661cf5069297b282171f185f1d71dcf48593282f2ac7a999c682 Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.775414 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6cc294-684f-4ac2-8eb1-183af364c619" path="/var/lib/kubelet/pods/9f6cc294-684f-4ac2-8eb1-183af364c619/volumes" Dec 07 19:20:29 crc kubenswrapper[4815]: I1207 19:20:29.776050 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df54df8a-669c-4230-b377-640a79b757ab" 
path="/var/lib/kubelet/pods/df54df8a-669c-4230-b377-640a79b757ab/volumes" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.134402 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" event={"ID":"7d21150d-3baa-49af-ac9d-3eb0ab484b18","Type":"ContainerStarted","Data":"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b"} Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.134736 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" event={"ID":"7d21150d-3baa-49af-ac9d-3eb0ab484b18","Type":"ContainerStarted","Data":"d3e0640fe5cd661cf5069297b282171f185f1d71dcf48593282f2ac7a999c682"} Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.136497 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.146007 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.158902 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" event={"ID":"ed9e1527-b614-4f46-bdb1-787c5db70e7b","Type":"ContainerStarted","Data":"6bd651d83cbd8c8b1f00d4f9b7d8ec6895dada3b38b19251fed41ad8c6662059"} Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.159000 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" event={"ID":"ed9e1527-b614-4f46-bdb1-787c5db70e7b","Type":"ContainerStarted","Data":"a4559a70be43660d267afaf8170d6452214ce6583be91db906d36ad277450041"} Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.160028 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.172515 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.174072 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" podStartSLOduration=3.174056222 podStartE2EDuration="3.174056222s" podCreationTimestamp="2025-12-07 19:20:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:20:30.173015856 +0000 UTC m=+334.752005971" watchObservedRunningTime="2025-12-07 19:20:30.174056222 +0000 UTC m=+334.753046267" Dec 07 19:20:30 crc kubenswrapper[4815]: I1207 19:20:30.213499 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c4c7d9b57-b8jkp" podStartSLOduration=4.213483422 podStartE2EDuration="4.213483422s" podCreationTimestamp="2025-12-07 19:20:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:20:30.213375218 +0000 UTC m=+334.792365263" watchObservedRunningTime="2025-12-07 19:20:30.213483422 +0000 UTC m=+334.792473467" Dec 07 19:20:56 crc kubenswrapper[4815]: I1207 19:20:56.359312 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:20:56 crc kubenswrapper[4815]: I1207 19:20:56.359864 4815 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:21:26 crc kubenswrapper[4815]: I1207 19:21:26.359950 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:21:26 crc kubenswrapper[4815]: I1207 19:21:26.360475 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:21:26 crc kubenswrapper[4815]: I1207 19:21:26.665504 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:21:26 crc kubenswrapper[4815]: I1207 19:21:26.665753 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" podUID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" containerName="controller-manager" containerID="cri-o://9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b" gracePeriod=30 Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.028533 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.173437 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca\") pod \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.173501 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config\") pod \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.173574 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert\") pod \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.173592 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles\") pod \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.173619 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shwx8\" (UniqueName: \"kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8\") pod \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\" (UID: \"7d21150d-3baa-49af-ac9d-3eb0ab484b18\") " Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.174278 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d21150d-3baa-49af-ac9d-3eb0ab484b18" (UID: "7d21150d-3baa-49af-ac9d-3eb0ab484b18"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.174300 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config" (OuterVolumeSpecName: "config") pod "7d21150d-3baa-49af-ac9d-3eb0ab484b18" (UID: "7d21150d-3baa-49af-ac9d-3eb0ab484b18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.174665 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d21150d-3baa-49af-ac9d-3eb0ab484b18" (UID: "7d21150d-3baa-49af-ac9d-3eb0ab484b18"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.181188 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8" (OuterVolumeSpecName: "kube-api-access-shwx8") pod "7d21150d-3baa-49af-ac9d-3eb0ab484b18" (UID: "7d21150d-3baa-49af-ac9d-3eb0ab484b18"). InnerVolumeSpecName "kube-api-access-shwx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.197426 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d21150d-3baa-49af-ac9d-3eb0ab484b18" (UID: "7d21150d-3baa-49af-ac9d-3eb0ab484b18"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.275215 4815 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-client-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.275267 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.275276 4815 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d21150d-3baa-49af-ac9d-3eb0ab484b18-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.275286 4815 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d21150d-3baa-49af-ac9d-3eb0ab484b18-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.275302 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shwx8\" (UniqueName: \"kubernetes.io/projected/7d21150d-3baa-49af-ac9d-3eb0ab484b18-kube-api-access-shwx8\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.519064 4815 generic.go:334] "Generic (PLEG): container finished" podID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" containerID="9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b" exitCode=0 Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.519124 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" event={"ID":"7d21150d-3baa-49af-ac9d-3eb0ab484b18","Type":"ContainerDied","Data":"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b"} Dec 07 19:21:27 crc 
kubenswrapper[4815]: I1207 19:21:27.519162 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" event={"ID":"7d21150d-3baa-49af-ac9d-3eb0ab484b18","Type":"ContainerDied","Data":"d3e0640fe5cd661cf5069297b282171f185f1d71dcf48593282f2ac7a999c682"} Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.519188 4815 scope.go:117] "RemoveContainer" containerID="9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.519336 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6799c5f44c-lhp92" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.551355 4815 scope.go:117] "RemoveContainer" containerID="9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b" Dec 07 19:21:27 crc kubenswrapper[4815]: E1207 19:21:27.551822 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b\": container with ID starting with 9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b not found: ID does not exist" containerID="9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.551857 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b"} err="failed to get container status \"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b\": rpc error: code = NotFound desc = could not find container \"9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b\": container with ID starting with 9351066d4f93cc11a37a9f39ae9bcd698e6070facb939763b5eb2f27b2dd526b not found: ID does not exist" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 
19:21:27.559230 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.565532 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6799c5f44c-lhp92"] Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.775543 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" path="/var/lib/kubelet/pods/7d21150d-3baa-49af-ac9d-3eb0ab484b18/volumes" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.864755 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-744fcb7887-sj2qf"] Dec 07 19:21:27 crc kubenswrapper[4815]: E1207 19:21:27.864986 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" containerName="controller-manager" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.864999 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" containerName="controller-manager" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.865087 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d21150d-3baa-49af-ac9d-3eb0ab484b18" containerName="controller-manager" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.865445 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.867522 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.867868 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.868285 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.868597 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.868945 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.869221 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.878686 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-744fcb7887-sj2qf"] Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.881982 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-proxy-ca-bundles\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.882021 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-client-ca\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.882063 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf6d47c-7974-44e0-966b-445fb45f473e-serving-cert\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.882096 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpg9\" (UniqueName: \"kubernetes.io/projected/edf6d47c-7974-44e0-966b-445fb45f473e-kube-api-access-4tpg9\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.882127 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-config\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.891745 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.983051 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-proxy-ca-bundles\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.983114 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-client-ca\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.983172 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf6d47c-7974-44e0-966b-445fb45f473e-serving-cert\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.983207 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpg9\" (UniqueName: \"kubernetes.io/projected/edf6d47c-7974-44e0-966b-445fb45f473e-kube-api-access-4tpg9\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.983252 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-config\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.984197 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-client-ca\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.984342 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-proxy-ca-bundles\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.985070 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edf6d47c-7974-44e0-966b-445fb45f473e-config\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.988566 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edf6d47c-7974-44e0-966b-445fb45f473e-serving-cert\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:27 crc kubenswrapper[4815]: I1207 19:21:27.999716 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpg9\" (UniqueName: \"kubernetes.io/projected/edf6d47c-7974-44e0-966b-445fb45f473e-kube-api-access-4tpg9\") pod \"controller-manager-744fcb7887-sj2qf\" (UID: \"edf6d47c-7974-44e0-966b-445fb45f473e\") " pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 
19:21:28 crc kubenswrapper[4815]: I1207 19:21:28.180954 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:28 crc kubenswrapper[4815]: I1207 19:21:28.420324 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-744fcb7887-sj2qf"] Dec 07 19:21:28 crc kubenswrapper[4815]: I1207 19:21:28.524273 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" event={"ID":"edf6d47c-7974-44e0-966b-445fb45f473e","Type":"ContainerStarted","Data":"413f5b4ef481869166857ad1dbadea1d2adca1c36ae569f3ec270c755b4a801e"} Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.456469 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdcwj"] Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.457759 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.483251 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdcwj"] Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.530614 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" event={"ID":"edf6d47c-7974-44e0-966b-445fb45f473e","Type":"ContainerStarted","Data":"9b08faf284ad9b6ace2a3cf7724d182d4920149be6beabb50b309d257c2a2d66"} Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.531604 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.535022 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.552549 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-744fcb7887-sj2qf" podStartSLOduration=3.552527403 podStartE2EDuration="3.552527403s" podCreationTimestamp="2025-12-07 19:21:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:21:29.547519568 +0000 UTC m=+394.126509613" watchObservedRunningTime="2025-12-07 19:21:29.552527403 +0000 UTC m=+394.131517468" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.608081 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-kube-api-access-7jclc\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.608556 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-trusted-ca\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.608680 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-bound-sa-token\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.608828 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-tls\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.609121 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af0f0b52-031d-47ec-b885-e1d7d02e7678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.609229 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/af0f0b52-031d-47ec-b885-e1d7d02e7678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.609365 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.609877 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-certificates\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.640567 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.711618 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/af0f0b52-031d-47ec-b885-e1d7d02e7678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.711966 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-certificates\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712066 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-kube-api-access-7jclc\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712155 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-trusted-ca\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712255 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-bound-sa-token\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712349 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-tls\") pod 
\"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712474 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af0f0b52-031d-47ec-b885-e1d7d02e7678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.712894 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/af0f0b52-031d-47ec-b885-e1d7d02e7678-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.713116 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-certificates\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.715796 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/af0f0b52-031d-47ec-b885-e1d7d02e7678-trusted-ca\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.727058 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/af0f0b52-031d-47ec-b885-e1d7d02e7678-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.729758 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jclc\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-kube-api-access-7jclc\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.733782 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-registry-tls\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.743409 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/af0f0b52-031d-47ec-b885-e1d7d02e7678-bound-sa-token\") pod \"image-registry-66df7c8f76-jdcwj\" (UID: \"af0f0b52-031d-47ec-b885-e1d7d02e7678\") " pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:29 crc kubenswrapper[4815]: I1207 19:21:29.772952 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:30 crc kubenswrapper[4815]: I1207 19:21:30.172162 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jdcwj"] Dec 07 19:21:30 crc kubenswrapper[4815]: W1207 19:21:30.178106 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0f0b52_031d_47ec_b885_e1d7d02e7678.slice/crio-aaff1b2bb51f08656ce66a474ea64f69b19aea0dc7ae66518163ed964dff1fa9 WatchSource:0}: Error finding container aaff1b2bb51f08656ce66a474ea64f69b19aea0dc7ae66518163ed964dff1fa9: Status 404 returned error can't find the container with id aaff1b2bb51f08656ce66a474ea64f69b19aea0dc7ae66518163ed964dff1fa9 Dec 07 19:21:30 crc kubenswrapper[4815]: I1207 19:21:30.537823 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" event={"ID":"af0f0b52-031d-47ec-b885-e1d7d02e7678","Type":"ContainerStarted","Data":"8d9db9b842babc52d7e08a40a741c72fceb65eccb2fe27b57cd10c99a0337f7c"} Dec 07 19:21:30 crc kubenswrapper[4815]: I1207 19:21:30.538194 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" event={"ID":"af0f0b52-031d-47ec-b885-e1d7d02e7678","Type":"ContainerStarted","Data":"aaff1b2bb51f08656ce66a474ea64f69b19aea0dc7ae66518163ed964dff1fa9"} Dec 07 19:21:30 crc kubenswrapper[4815]: I1207 19:21:30.562340 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" podStartSLOduration=1.562318522 podStartE2EDuration="1.562318522s" podCreationTimestamp="2025-12-07 19:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:21:30.561698434 +0000 UTC m=+395.140688479" 
watchObservedRunningTime="2025-12-07 19:21:30.562318522 +0000 UTC m=+395.141308567" Dec 07 19:21:31 crc kubenswrapper[4815]: I1207 19:21:31.543707 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.633286 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.634117 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52nff" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="registry-server" containerID="cri-o://4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" gracePeriod=30 Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.644765 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.645035 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nb88w" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="registry-server" containerID="cri-o://a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b" gracePeriod=30 Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.651331 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.651576 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" containerID="cri-o://5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c" gracePeriod=30 Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 
19:21:39.662480 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.662742 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbt7w" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="registry-server" containerID="cri-o://64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4" gracePeriod=30 Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.680061 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x887s"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.680798 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.685010 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.685260 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ssmr7" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="registry-server" containerID="cri-o://dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96" gracePeriod=30 Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.698987 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x887s"] Dec 07 19:21:39 crc kubenswrapper[4815]: E1207 19:21:39.702764 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" cmd=["grpc_health_probe","-addr=:50051"] Dec 07 
19:21:39 crc kubenswrapper[4815]: E1207 19:21:39.706080 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" cmd=["grpc_health_probe","-addr=:50051"] Dec 07 19:21:39 crc kubenswrapper[4815]: E1207 19:21:39.707002 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" cmd=["grpc_health_probe","-addr=:50051"] Dec 07 19:21:39 crc kubenswrapper[4815]: E1207 19:21:39.707044 4815 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-52nff" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="registry-server" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.743964 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvp4\" (UniqueName: \"kubernetes.io/projected/c32bb3fd-40b2-4f28-9fad-9283162b80c1-kube-api-access-4qvp4\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.744023 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.744054 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.847135 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.847404 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvp4\" (UniqueName: \"kubernetes.io/projected/c32bb3fd-40b2-4f28-9fad-9283162b80c1-kube-api-access-4qvp4\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.847429 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.848363 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.855644 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c32bb3fd-40b2-4f28-9fad-9283162b80c1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:39 crc kubenswrapper[4815]: I1207 19:21:39.879851 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvp4\" (UniqueName: \"kubernetes.io/projected/c32bb3fd-40b2-4f28-9fad-9283162b80c1-kube-api-access-4qvp4\") pod \"marketplace-operator-79b997595-x887s\" (UID: \"c32bb3fd-40b2-4f28-9fad-9283162b80c1\") " pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.011396 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.144831 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.251733 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities\") pod \"b9de4e38-617b-41a4-b97f-155d559d497a\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.251806 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48h9v\" (UniqueName: \"kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v\") pod \"b9de4e38-617b-41a4-b97f-155d559d497a\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.251862 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content\") pod \"b9de4e38-617b-41a4-b97f-155d559d497a\" (UID: \"b9de4e38-617b-41a4-b97f-155d559d497a\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.252617 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities" (OuterVolumeSpecName: "utilities") pod "b9de4e38-617b-41a4-b97f-155d559d497a" (UID: "b9de4e38-617b-41a4-b97f-155d559d497a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.259964 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v" (OuterVolumeSpecName: "kube-api-access-48h9v") pod "b9de4e38-617b-41a4-b97f-155d559d497a" (UID: "b9de4e38-617b-41a4-b97f-155d559d497a"). InnerVolumeSpecName "kube-api-access-48h9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.317693 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.326468 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9de4e38-617b-41a4-b97f-155d559d497a" (UID: "b9de4e38-617b-41a4-b97f-155d559d497a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.353592 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48h9v\" (UniqueName: \"kubernetes.io/projected/b9de4e38-617b-41a4-b97f-155d559d497a-kube-api-access-48h9v\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.353614 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.353622 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9de4e38-617b-41a4-b97f-155d559d497a-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.353772 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.385079 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.434222 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454119 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvgpf\" (UniqueName: \"kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf\") pod \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454204 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities\") pod \"34c69927-3b4e-4e18-8201-27eb981bad10\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454257 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics\") pod \"e58da229-9be5-4d48-a1af-74d5316d09f3\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454307 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8d2t\" (UniqueName: \"kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t\") pod \"e58da229-9be5-4d48-a1af-74d5316d09f3\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454334 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content\") pod \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454361 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities\") pod \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\" (UID: \"69cc2280-9dd4-43d1-87d6-c54a5b801a32\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454389 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca\") pod \"e58da229-9be5-4d48-a1af-74d5316d09f3\" (UID: \"e58da229-9be5-4d48-a1af-74d5316d09f3\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454422 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content\") pod \"34c69927-3b4e-4e18-8201-27eb981bad10\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.454448 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lvjz\" (UniqueName: \"kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz\") pod \"34c69927-3b4e-4e18-8201-27eb981bad10\" (UID: \"34c69927-3b4e-4e18-8201-27eb981bad10\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.456562 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e58da229-9be5-4d48-a1af-74d5316d09f3" (UID: "e58da229-9be5-4d48-a1af-74d5316d09f3"). 
InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.456748 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities" (OuterVolumeSpecName: "utilities") pod "69cc2280-9dd4-43d1-87d6-c54a5b801a32" (UID: "69cc2280-9dd4-43d1-87d6-c54a5b801a32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.459428 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e58da229-9be5-4d48-a1af-74d5316d09f3" (UID: "e58da229-9be5-4d48-a1af-74d5316d09f3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.460527 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities" (OuterVolumeSpecName: "utilities") pod "34c69927-3b4e-4e18-8201-27eb981bad10" (UID: "34c69927-3b4e-4e18-8201-27eb981bad10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.460991 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf" (OuterVolumeSpecName: "kube-api-access-hvgpf") pod "69cc2280-9dd4-43d1-87d6-c54a5b801a32" (UID: "69cc2280-9dd4-43d1-87d6-c54a5b801a32"). InnerVolumeSpecName "kube-api-access-hvgpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.461018 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t" (OuterVolumeSpecName: "kube-api-access-z8d2t") pod "e58da229-9be5-4d48-a1af-74d5316d09f3" (UID: "e58da229-9be5-4d48-a1af-74d5316d09f3"). InnerVolumeSpecName "kube-api-access-z8d2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.465079 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz" (OuterVolumeSpecName: "kube-api-access-8lvjz") pod "34c69927-3b4e-4e18-8201-27eb981bad10" (UID: "34c69927-3b4e-4e18-8201-27eb981bad10"). InnerVolumeSpecName "kube-api-access-8lvjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.542633 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69cc2280-9dd4-43d1-87d6-c54a5b801a32" (UID: "69cc2280-9dd4-43d1-87d6-c54a5b801a32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555080 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities\") pod \"04fe5f63-515f-4b66-963c-c2ce259b9bad\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555206 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtfrt\" (UniqueName: \"kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt\") pod \"04fe5f63-515f-4b66-963c-c2ce259b9bad\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555249 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content\") pod \"04fe5f63-515f-4b66-963c-c2ce259b9bad\" (UID: \"04fe5f63-515f-4b66-963c-c2ce259b9bad\") " Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555476 4815 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555493 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8d2t\" (UniqueName: \"kubernetes.io/projected/e58da229-9be5-4d48-a1af-74d5316d09f3-kube-api-access-z8d2t\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555502 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc 
kubenswrapper[4815]: I1207 19:21:40.555511 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69cc2280-9dd4-43d1-87d6-c54a5b801a32-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555519 4815 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e58da229-9be5-4d48-a1af-74d5316d09f3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555528 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lvjz\" (UniqueName: \"kubernetes.io/projected/34c69927-3b4e-4e18-8201-27eb981bad10-kube-api-access-8lvjz\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555537 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvgpf\" (UniqueName: \"kubernetes.io/projected/69cc2280-9dd4-43d1-87d6-c54a5b801a32-kube-api-access-hvgpf\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.555544 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.556567 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities" (OuterVolumeSpecName: "utilities") pod "04fe5f63-515f-4b66-963c-c2ce259b9bad" (UID: "04fe5f63-515f-4b66-963c-c2ce259b9bad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.557934 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt" (OuterVolumeSpecName: "kube-api-access-jtfrt") pod "04fe5f63-515f-4b66-963c-c2ce259b9bad" (UID: "04fe5f63-515f-4b66-963c-c2ce259b9bad"). InnerVolumeSpecName "kube-api-access-jtfrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.572645 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04fe5f63-515f-4b66-963c-c2ce259b9bad" (UID: "04fe5f63-515f-4b66-963c-c2ce259b9bad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.577894 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34c69927-3b4e-4e18-8201-27eb981bad10" (UID: "34c69927-3b4e-4e18-8201-27eb981bad10"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.597413 4815 generic.go:334] "Generic (PLEG): container finished" podID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerID="5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c" exitCode=0 Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.597511 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerDied","Data":"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.597684 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" event={"ID":"e58da229-9be5-4d48-a1af-74d5316d09f3","Type":"ContainerDied","Data":"aac1a4ad813ee6b3188f289419041212a420e2ddd65928e238c66f9b75ebe0cf"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.597733 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2r46v" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.597810 4815 scope.go:117] "RemoveContainer" containerID="5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.602069 4815 generic.go:334] "Generic (PLEG): container finished" podID="b9de4e38-617b-41a4-b97f-155d559d497a" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" exitCode=0 Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.602162 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerDied","Data":"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.602195 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52nff" event={"ID":"b9de4e38-617b-41a4-b97f-155d559d497a","Type":"ContainerDied","Data":"05c3f5609de0256250c7d92e590cf20b8921e387c66f9b2185c74576dd3cfb82"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.602163 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52nff" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.604731 4815 generic.go:334] "Generic (PLEG): container finished" podID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerID="a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b" exitCode=0 Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.604823 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerDied","Data":"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.604855 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nb88w" event={"ID":"69cc2280-9dd4-43d1-87d6-c54a5b801a32","Type":"ContainerDied","Data":"58e4c181e8567b82c04d29f5203d79cd5f6b34f9ec242e1c5ded64d82435c9b7"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.604962 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nb88w" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.613272 4815 generic.go:334] "Generic (PLEG): container finished" podID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerID="64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4" exitCode=0 Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.613346 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerDied","Data":"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.613369 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbt7w" event={"ID":"04fe5f63-515f-4b66-963c-c2ce259b9bad","Type":"ContainerDied","Data":"2bcd036bcecafcb1b6c576ce945b9afdf691bc050ebd3b750f071a2073b52c98"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.613604 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbt7w" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.621126 4815 scope.go:117] "RemoveContainer" containerID="254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.622896 4815 generic.go:334] "Generic (PLEG): container finished" podID="34c69927-3b4e-4e18-8201-27eb981bad10" containerID="dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96" exitCode=0 Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.623408 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ssmr7" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.623071 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerDied","Data":"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.624231 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ssmr7" event={"ID":"34c69927-3b4e-4e18-8201-27eb981bad10","Type":"ContainerDied","Data":"61a57393d896752786f03c0b482a2a62fb13d6d9163a360a2a3f1320e99f7332"} Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.654732 4815 scope.go:117] "RemoveContainer" containerID="5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.655232 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c\": container with ID starting with 5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c not found: ID does not exist" containerID="5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.655258 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c"} err="failed to get container status \"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c\": rpc error: code = NotFound desc = could not find container \"5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c\": container with ID starting with 5c1367bd020b3ea1e7ef021e2d310dd5a91fb5d9fd7e9470fc1d557d44f6656c not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 
19:21:40.655280 4815 scope.go:117] "RemoveContainer" containerID="254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.655690 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad\": container with ID starting with 254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad not found: ID does not exist" containerID="254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.655749 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad"} err="failed to get container status \"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad\": rpc error: code = NotFound desc = could not find container \"254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad\": container with ID starting with 254c86c7a7a54b29d6f66e30ae0211606dfb4924aac8c6f2bd2717b2514b73ad not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.655781 4815 scope.go:117] "RemoveContainer" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.671679 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.671751 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34c69927-3b4e-4e18-8201-27eb981bad10-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.671769 4815 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-jtfrt\" (UniqueName: \"kubernetes.io/projected/04fe5f63-515f-4b66-963c-c2ce259b9bad-kube-api-access-jtfrt\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.671794 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04fe5f63-515f-4b66-963c-c2ce259b9bad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.677387 4815 scope.go:117] "RemoveContainer" containerID="60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.681765 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-x887s"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.696557 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.707153 4815 scope.go:117] "RemoveContainer" containerID="1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.710611 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2r46v"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.714537 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.718374 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-52nff"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.734242 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.734906 4815 scope.go:117] "RemoveContainer" 
containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.739019 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbt7w"] Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.739524 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88\": container with ID starting with 4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88 not found: ID does not exist" containerID="4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.739557 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88"} err="failed to get container status \"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88\": rpc error: code = NotFound desc = could not find container \"4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88\": container with ID starting with 4607e322a8006f0373c80f1e2059aa62873bb765427a576f43a4b8f37eb89f88 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.739582 4815 scope.go:117] "RemoveContainer" containerID="60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.739981 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a\": container with ID starting with 60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a not found: ID does not exist" containerID="60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 
19:21:40.740081 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a"} err="failed to get container status \"60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a\": rpc error: code = NotFound desc = could not find container \"60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a\": container with ID starting with 60d9e41befa3fa4fc4f6fe8bb9401a0d3d3a289e8ea313d0d44a502f85baeb2a not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.740181 4815 scope.go:117] "RemoveContainer" containerID="1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.740785 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5\": container with ID starting with 1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5 not found: ID does not exist" containerID="1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.740853 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5"} err="failed to get container status \"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5\": rpc error: code = NotFound desc = could not find container \"1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5\": container with ID starting with 1b6b48e4a2db772642abfa012759f50320a37c8fecebacfda624d9b0bf93d0e5 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.740948 4815 scope.go:117] "RemoveContainer" containerID="a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b" Dec 07 19:21:40 crc 
kubenswrapper[4815]: I1207 19:21:40.745421 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.747150 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ssmr7"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.751532 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.754609 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nb88w"] Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.758942 4815 scope.go:117] "RemoveContainer" containerID="277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.817058 4815 scope.go:117] "RemoveContainer" containerID="da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.850428 4815 scope.go:117] "RemoveContainer" containerID="a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.850859 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b\": container with ID starting with a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b not found: ID does not exist" containerID="a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.850907 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b"} err="failed to get container status 
\"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b\": rpc error: code = NotFound desc = could not find container \"a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b\": container with ID starting with a139fcab43f7388eece156b83d154b00b660e3ca46ef99e8ae0279877f79146b not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.850956 4815 scope.go:117] "RemoveContainer" containerID="277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.851290 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590\": container with ID starting with 277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590 not found: ID does not exist" containerID="277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.851317 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590"} err="failed to get container status \"277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590\": rpc error: code = NotFound desc = could not find container \"277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590\": container with ID starting with 277c34db773db200adcd120ae3061186720c3c3b1c37d3a7b08d8b052ca77590 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.851333 4815 scope.go:117] "RemoveContainer" containerID="da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.851590 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f\": container with ID starting with da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f not found: ID does not exist" containerID="da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.851615 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f"} err="failed to get container status \"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f\": rpc error: code = NotFound desc = could not find container \"da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f\": container with ID starting with da221eb599563e2ce2605987eb6b83067980590999da90411e629d285171ed0f not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.851634 4815 scope.go:117] "RemoveContainer" containerID="64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.877117 4815 scope.go:117] "RemoveContainer" containerID="395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.893013 4815 scope.go:117] "RemoveContainer" containerID="565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.914807 4815 scope.go:117] "RemoveContainer" containerID="64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.915233 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4\": container with ID starting with 64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4 not found: ID does not exist" 
containerID="64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.915329 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4"} err="failed to get container status \"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4\": rpc error: code = NotFound desc = could not find container \"64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4\": container with ID starting with 64b300ae46be3492721068216bda10b21d81f220749cbfafc2f39cf65db269a4 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.915438 4815 scope.go:117] "RemoveContainer" containerID="395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.915757 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c\": container with ID starting with 395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c not found: ID does not exist" containerID="395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.915852 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c"} err="failed to get container status \"395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c\": rpc error: code = NotFound desc = could not find container \"395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c\": container with ID starting with 395469c72e7cbee9c154af95f11191d9d938ab38b2a3cb573658d6979747564c not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.915942 4815 scope.go:117] 
"RemoveContainer" containerID="565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.916262 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028\": container with ID starting with 565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028 not found: ID does not exist" containerID="565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.916288 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028"} err="failed to get container status \"565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028\": rpc error: code = NotFound desc = could not find container \"565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028\": container with ID starting with 565fe84718016e2f2636ae43b41a6f803b351df134f66c9e1713f4edfa90d028 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.916307 4815 scope.go:117] "RemoveContainer" containerID="dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.933259 4815 scope.go:117] "RemoveContainer" containerID="75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.948096 4815 scope.go:117] "RemoveContainer" containerID="27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.960466 4815 scope.go:117] "RemoveContainer" containerID="dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.961210 4815 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96\": container with ID starting with dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96 not found: ID does not exist" containerID="dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.961246 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96"} err="failed to get container status \"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96\": rpc error: code = NotFound desc = could not find container \"dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96\": container with ID starting with dc316da3ca756944a74687ba403b388d207be913f64f2d8a90ec49bd7520ac96 not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.961291 4815 scope.go:117] "RemoveContainer" containerID="75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.961725 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc\": container with ID starting with 75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc not found: ID does not exist" containerID="75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.961778 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc"} err="failed to get container status \"75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc\": rpc error: code = NotFound desc = could not find container 
\"75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc\": container with ID starting with 75940c7ec359ac6048f851d381a8f43d4a904311f6d61fad6deb75f2674599cc not found: ID does not exist" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.961812 4815 scope.go:117] "RemoveContainer" containerID="27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a" Dec 07 19:21:40 crc kubenswrapper[4815]: E1207 19:21:40.962150 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a\": container with ID starting with 27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a not found: ID does not exist" containerID="27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a" Dec 07 19:21:40 crc kubenswrapper[4815]: I1207 19:21:40.962173 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a"} err="failed to get container status \"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a\": rpc error: code = NotFound desc = could not find container \"27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a\": container with ID starting with 27c861801eede3f346f990c138530d885e28b135b9bc397bd566b76f2581926a not found: ID does not exist" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.632546 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" event={"ID":"c32bb3fd-40b2-4f28-9fad-9283162b80c1","Type":"ContainerStarted","Data":"5e4ef9a4ae3979f2e13eb6c35a733192255fa9b257e713bb4edba91725070b14"} Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.632587 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" 
event={"ID":"c32bb3fd-40b2-4f28-9fad-9283162b80c1","Type":"ContainerStarted","Data":"909d3689689e80d2331d2214c6b64a524312addc797b35c259c9c336cc1eb3b7"} Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.634627 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.635969 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.662375 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-x887s" podStartSLOduration=2.662332816 podStartE2EDuration="2.662332816s" podCreationTimestamp="2025-12-07 19:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:21:41.65417726 +0000 UTC m=+406.233167315" watchObservedRunningTime="2025-12-07 19:21:41.662332816 +0000 UTC m=+406.241322871" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.777341 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" path="/var/lib/kubelet/pods/04fe5f63-515f-4b66-963c-c2ce259b9bad/volumes" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.778149 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" path="/var/lib/kubelet/pods/34c69927-3b4e-4e18-8201-27eb981bad10/volumes" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.778846 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" path="/var/lib/kubelet/pods/69cc2280-9dd4-43d1-87d6-c54a5b801a32/volumes" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.780903 4815 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" path="/var/lib/kubelet/pods/b9de4e38-617b-41a4-b97f-155d559d497a/volumes" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.781577 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" path="/var/lib/kubelet/pods/e58da229-9be5-4d48-a1af-74d5316d09f3/volumes" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.849841 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4qt4c"] Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850054 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850067 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850079 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850085 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850093 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850100 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850109 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="extract-content" Dec 07 19:21:41 crc 
kubenswrapper[4815]: I1207 19:21:41.850114 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850123 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850130 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850141 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850147 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850155 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850160 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850169 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850175 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850183 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="extract-utilities" Dec 07 19:21:41 crc 
kubenswrapper[4815]: I1207 19:21:41.850189 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850198 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850206 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="extract-content" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850216 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850222 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850228 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850233 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850242 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850248 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="extract-utilities" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850324 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c69927-3b4e-4e18-8201-27eb981bad10" containerName="registry-server" Dec 07 19:21:41 crc 
kubenswrapper[4815]: I1207 19:21:41.850335 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850343 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850352 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="04fe5f63-515f-4b66-963c-c2ce259b9bad" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850357 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9de4e38-617b-41a4-b97f-155d559d497a" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850366 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="69cc2280-9dd4-43d1-87d6-c54a5b801a32" containerName="registry-server" Dec 07 19:21:41 crc kubenswrapper[4815]: E1207 19:21:41.850515 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.850522 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e58da229-9be5-4d48-a1af-74d5316d09f3" containerName="marketplace-operator" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.851095 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.853597 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.857608 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qt4c"] Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.988685 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-catalog-content\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.988746 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qwph\" (UniqueName: \"kubernetes.io/projected/20784fa7-7f38-4dde-a229-37dc1db2e351-kube-api-access-8qwph\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:41 crc kubenswrapper[4815]: I1207 19:21:41.988789 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-utilities\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.044889 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rgtd5"] Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.045847 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.049233 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.056789 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rgtd5"] Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.089981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-utilities\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.090069 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-catalog-content\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.090102 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qwph\" (UniqueName: \"kubernetes.io/projected/20784fa7-7f38-4dde-a229-37dc1db2e351-kube-api-access-8qwph\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.090721 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-utilities\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 
07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.090972 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20784fa7-7f38-4dde-a229-37dc1db2e351-catalog-content\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.115218 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qwph\" (UniqueName: \"kubernetes.io/projected/20784fa7-7f38-4dde-a229-37dc1db2e351-kube-api-access-8qwph\") pod \"redhat-marketplace-4qt4c\" (UID: \"20784fa7-7f38-4dde-a229-37dc1db2e351\") " pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.177566 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.191929 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64tcn\" (UniqueName: \"kubernetes.io/projected/ca317190-c788-4b00-9702-237d043cb5ed-kube-api-access-64tcn\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.192091 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-catalog-content\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.192146 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-utilities\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.293207 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64tcn\" (UniqueName: \"kubernetes.io/projected/ca317190-c788-4b00-9702-237d043cb5ed-kube-api-access-64tcn\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.293477 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-catalog-content\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.293516 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-utilities\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.294012 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-catalog-content\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.294233 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ca317190-c788-4b00-9702-237d043cb5ed-utilities\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.311874 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64tcn\" (UniqueName: \"kubernetes.io/projected/ca317190-c788-4b00-9702-237d043cb5ed-kube-api-access-64tcn\") pod \"redhat-operators-rgtd5\" (UID: \"ca317190-c788-4b00-9702-237d043cb5ed\") " pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.364532 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.583284 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4qt4c"] Dec 07 19:21:42 crc kubenswrapper[4815]: W1207 19:21:42.589471 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20784fa7_7f38_4dde_a229_37dc1db2e351.slice/crio-2d71e5ba4b5560f65ee4085abac1d048bb7bde6263d0415fdbbcc99b27c443aa WatchSource:0}: Error finding container 2d71e5ba4b5560f65ee4085abac1d048bb7bde6263d0415fdbbcc99b27c443aa: Status 404 returned error can't find the container with id 2d71e5ba4b5560f65ee4085abac1d048bb7bde6263d0415fdbbcc99b27c443aa Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.643487 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qt4c" event={"ID":"20784fa7-7f38-4dde-a229-37dc1db2e351","Type":"ContainerStarted","Data":"2d71e5ba4b5560f65ee4085abac1d048bb7bde6263d0415fdbbcc99b27c443aa"} Dec 07 19:21:42 crc kubenswrapper[4815]: I1207 19:21:42.836855 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rgtd5"] 
Dec 07 19:21:42 crc kubenswrapper[4815]: W1207 19:21:42.845122 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca317190_c788_4b00_9702_237d043cb5ed.slice/crio-a730b68299c56c38eff8e60344b477e33fb3aa06f5000d043f65847730d5ecd1 WatchSource:0}: Error finding container a730b68299c56c38eff8e60344b477e33fb3aa06f5000d043f65847730d5ecd1: Status 404 returned error can't find the container with id a730b68299c56c38eff8e60344b477e33fb3aa06f5000d043f65847730d5ecd1 Dec 07 19:21:43 crc kubenswrapper[4815]: I1207 19:21:43.648697 4815 generic.go:334] "Generic (PLEG): container finished" podID="20784fa7-7f38-4dde-a229-37dc1db2e351" containerID="df863c31e8db1a7488154c407ce810e2de8310c2b0c37cab05ec7ffcae202cf0" exitCode=0 Dec 07 19:21:43 crc kubenswrapper[4815]: I1207 19:21:43.648795 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qt4c" event={"ID":"20784fa7-7f38-4dde-a229-37dc1db2e351","Type":"ContainerDied","Data":"df863c31e8db1a7488154c407ce810e2de8310c2b0c37cab05ec7ffcae202cf0"} Dec 07 19:21:43 crc kubenswrapper[4815]: I1207 19:21:43.650071 4815 generic.go:334] "Generic (PLEG): container finished" podID="ca317190-c788-4b00-9702-237d043cb5ed" containerID="b127c82514166a8aad6d41b645c54618198d73f67580a9122ac738140a1101e7" exitCode=0 Dec 07 19:21:43 crc kubenswrapper[4815]: I1207 19:21:43.650245 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rgtd5" event={"ID":"ca317190-c788-4b00-9702-237d043cb5ed","Type":"ContainerDied","Data":"b127c82514166a8aad6d41b645c54618198d73f67580a9122ac738140a1101e7"} Dec 07 19:21:43 crc kubenswrapper[4815]: I1207 19:21:43.650302 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rgtd5" 
event={"ID":"ca317190-c788-4b00-9702-237d043cb5ed","Type":"ContainerStarted","Data":"a730b68299c56c38eff8e60344b477e33fb3aa06f5000d043f65847730d5ecd1"} Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.257385 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qz8lw"] Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.259512 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.262668 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz8lw"] Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.263163 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.419053 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-catalog-content\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.419129 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgnkt\" (UniqueName: \"kubernetes.io/projected/3c190de3-1f0d-4455-ad0d-551ead201424-kube-api-access-rgnkt\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.419156 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-utilities\") pod 
\"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.447847 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9xq4b"] Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.448816 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.450442 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.501105 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xq4b"] Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520154 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-catalog-content\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520217 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-utilities\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520325 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd47\" (UniqueName: \"kubernetes.io/projected/bbe87d56-0208-4acc-aa62-97ba74eb4163-kube-api-access-8rd47\") pod 
\"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520380 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-catalog-content\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520423 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgnkt\" (UniqueName: \"kubernetes.io/projected/3c190de3-1f0d-4455-ad0d-551ead201424-kube-api-access-rgnkt\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.520453 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-utilities\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.521019 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-catalog-content\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.521400 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c190de3-1f0d-4455-ad0d-551ead201424-utilities\") pod \"certified-operators-qz8lw\" (UID: 
\"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.538772 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgnkt\" (UniqueName: \"kubernetes.io/projected/3c190de3-1f0d-4455-ad0d-551ead201424-kube-api-access-rgnkt\") pod \"certified-operators-qz8lw\" (UID: \"3c190de3-1f0d-4455-ad0d-551ead201424\") " pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.619846 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.621175 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-utilities\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.621244 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd47\" (UniqueName: \"kubernetes.io/projected/bbe87d56-0208-4acc-aa62-97ba74eb4163-kube-api-access-8rd47\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.621294 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-catalog-content\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.621877 4815 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-catalog-content\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.622065 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbe87d56-0208-4acc-aa62-97ba74eb4163-utilities\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.651391 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd47\" (UniqueName: \"kubernetes.io/projected/bbe87d56-0208-4acc-aa62-97ba74eb4163-kube-api-access-8rd47\") pod \"community-operators-9xq4b\" (UID: \"bbe87d56-0208-4acc-aa62-97ba74eb4163\") " pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.659143 4815 generic.go:334] "Generic (PLEG): container finished" podID="20784fa7-7f38-4dde-a229-37dc1db2e351" containerID="746fe0144eb4498b8c5f7698e100a7307106b86b18874a5b2a081890f097c449" exitCode=0 Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.659497 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qt4c" event={"ID":"20784fa7-7f38-4dde-a229-37dc1db2e351","Type":"ContainerDied","Data":"746fe0144eb4498b8c5f7698e100a7307106b86b18874a5b2a081890f097c449"} Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 19:21:44.663459 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rgtd5" event={"ID":"ca317190-c788-4b00-9702-237d043cb5ed","Type":"ContainerStarted","Data":"a00d1fa633be6bae799c9635ec2ce9cb0e73a40cfeeb1158b254297086340818"} Dec 07 19:21:44 crc kubenswrapper[4815]: I1207 
19:21:44.761369 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.078506 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qz8lw"] Dec 07 19:21:45 crc kubenswrapper[4815]: W1207 19:21:45.085416 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c190de3_1f0d_4455_ad0d_551ead201424.slice/crio-c803c5a4307033f7c8a9d4d9dcb289f85dd8294803b9ff52330f7013a68cf73a WatchSource:0}: Error finding container c803c5a4307033f7c8a9d4d9dcb289f85dd8294803b9ff52330f7013a68cf73a: Status 404 returned error can't find the container with id c803c5a4307033f7c8a9d4d9dcb289f85dd8294803b9ff52330f7013a68cf73a Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.199755 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xq4b"] Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.674531 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4qt4c" event={"ID":"20784fa7-7f38-4dde-a229-37dc1db2e351","Type":"ContainerStarted","Data":"b4af74fe673f2b7f3cbf7775c812deae17a58e9b0a79c399bd077459472c5599"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.677511 4815 generic.go:334] "Generic (PLEG): container finished" podID="ca317190-c788-4b00-9702-237d043cb5ed" containerID="a00d1fa633be6bae799c9635ec2ce9cb0e73a40cfeeb1158b254297086340818" exitCode=0 Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.677880 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rgtd5" event={"ID":"ca317190-c788-4b00-9702-237d043cb5ed","Type":"ContainerDied","Data":"a00d1fa633be6bae799c9635ec2ce9cb0e73a40cfeeb1158b254297086340818"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.683704 4815 
generic.go:334] "Generic (PLEG): container finished" podID="bbe87d56-0208-4acc-aa62-97ba74eb4163" containerID="d70d91439df492d7e97feb5e83c2189214337b93fc11a286cfb3ad0408097065" exitCode=0 Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.683785 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xq4b" event={"ID":"bbe87d56-0208-4acc-aa62-97ba74eb4163","Type":"ContainerDied","Data":"d70d91439df492d7e97feb5e83c2189214337b93fc11a286cfb3ad0408097065"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.683827 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xq4b" event={"ID":"bbe87d56-0208-4acc-aa62-97ba74eb4163","Type":"ContainerStarted","Data":"838e2de474e1226b8ab243803fa9867cc89b6c54868adfac75c1354cdf30e5b2"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.692975 4815 generic.go:334] "Generic (PLEG): container finished" podID="3c190de3-1f0d-4455-ad0d-551ead201424" containerID="1363b538628f45ac63e14f50105ea27a9ba4405534a4549fc6213793c74d9c76" exitCode=0 Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.693023 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz8lw" event={"ID":"3c190de3-1f0d-4455-ad0d-551ead201424","Type":"ContainerDied","Data":"1363b538628f45ac63e14f50105ea27a9ba4405534a4549fc6213793c74d9c76"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.693049 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz8lw" event={"ID":"3c190de3-1f0d-4455-ad0d-551ead201424","Type":"ContainerStarted","Data":"c803c5a4307033f7c8a9d4d9dcb289f85dd8294803b9ff52330f7013a68cf73a"} Dec 07 19:21:45 crc kubenswrapper[4815]: I1207 19:21:45.694258 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4qt4c" podStartSLOduration=3.2622264420000002 podStartE2EDuration="4.694250084s" 
podCreationTimestamp="2025-12-07 19:21:41 +0000 UTC" firstStartedPulling="2025-12-07 19:21:43.652157683 +0000 UTC m=+408.231147728" lastFinishedPulling="2025-12-07 19:21:45.084181315 +0000 UTC m=+409.663171370" observedRunningTime="2025-12-07 19:21:45.693207984 +0000 UTC m=+410.272198029" watchObservedRunningTime="2025-12-07 19:21:45.694250084 +0000 UTC m=+410.273240129" Dec 07 19:21:46 crc kubenswrapper[4815]: I1207 19:21:46.700483 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rgtd5" event={"ID":"ca317190-c788-4b00-9702-237d043cb5ed","Type":"ContainerStarted","Data":"7f46e3461c47f9828818be9a1180eb3323698b7af378a1d8175a27fe045aa6f8"} Dec 07 19:21:46 crc kubenswrapper[4815]: I1207 19:21:46.703004 4815 generic.go:334] "Generic (PLEG): container finished" podID="bbe87d56-0208-4acc-aa62-97ba74eb4163" containerID="f6e8005237bf88836ca936665b7d707780492d4359c7db9c55ac08bfbb89327c" exitCode=0 Dec 07 19:21:46 crc kubenswrapper[4815]: I1207 19:21:46.703894 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xq4b" event={"ID":"bbe87d56-0208-4acc-aa62-97ba74eb4163","Type":"ContainerDied","Data":"f6e8005237bf88836ca936665b7d707780492d4359c7db9c55ac08bfbb89327c"} Dec 07 19:21:46 crc kubenswrapper[4815]: I1207 19:21:46.720003 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rgtd5" podStartSLOduration=2.253925121 podStartE2EDuration="4.719984185s" podCreationTimestamp="2025-12-07 19:21:42 +0000 UTC" firstStartedPulling="2025-12-07 19:21:43.652159013 +0000 UTC m=+408.231149058" lastFinishedPulling="2025-12-07 19:21:46.118218077 +0000 UTC m=+410.697208122" observedRunningTime="2025-12-07 19:21:46.719570083 +0000 UTC m=+411.298560138" watchObservedRunningTime="2025-12-07 19:21:46.719984185 +0000 UTC m=+411.298974230" Dec 07 19:21:47 crc kubenswrapper[4815]: I1207 19:21:47.710033 4815 generic.go:334] "Generic 
(PLEG): container finished" podID="3c190de3-1f0d-4455-ad0d-551ead201424" containerID="4c9299b0980990fe5b779f4eeb9429e06edc81bcf1c499439ad9ba440d1d77a4" exitCode=0 Dec 07 19:21:47 crc kubenswrapper[4815]: I1207 19:21:47.710250 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz8lw" event={"ID":"3c190de3-1f0d-4455-ad0d-551ead201424","Type":"ContainerDied","Data":"4c9299b0980990fe5b779f4eeb9429e06edc81bcf1c499439ad9ba440d1d77a4"} Dec 07 19:21:47 crc kubenswrapper[4815]: I1207 19:21:47.715499 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xq4b" event={"ID":"bbe87d56-0208-4acc-aa62-97ba74eb4163","Type":"ContainerStarted","Data":"b83871108804b482da45984681f95458f6886dad464d62cf8eb65746a5fd8e8d"} Dec 07 19:21:47 crc kubenswrapper[4815]: I1207 19:21:47.745995 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9xq4b" podStartSLOduration=2.028785533 podStartE2EDuration="3.745978053s" podCreationTimestamp="2025-12-07 19:21:44 +0000 UTC" firstStartedPulling="2025-12-07 19:21:45.686085697 +0000 UTC m=+410.265075742" lastFinishedPulling="2025-12-07 19:21:47.403278217 +0000 UTC m=+411.982268262" observedRunningTime="2025-12-07 19:21:47.744081728 +0000 UTC m=+412.323071783" watchObservedRunningTime="2025-12-07 19:21:47.745978053 +0000 UTC m=+412.324968098" Dec 07 19:21:48 crc kubenswrapper[4815]: I1207 19:21:48.722361 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qz8lw" event={"ID":"3c190de3-1f0d-4455-ad0d-551ead201424","Type":"ContainerStarted","Data":"6369e57d16e022bb53d9d868470e6cb26a25c16e4d7b97e45911731ba60c6b5c"} Dec 07 19:21:49 crc kubenswrapper[4815]: I1207 19:21:49.807083 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jdcwj" Dec 07 19:21:49 crc 
kubenswrapper[4815]: I1207 19:21:49.848331 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qz8lw" podStartSLOduration=3.462453573 podStartE2EDuration="5.848316101s" podCreationTimestamp="2025-12-07 19:21:44 +0000 UTC" firstStartedPulling="2025-12-07 19:21:45.695295154 +0000 UTC m=+410.274285199" lastFinishedPulling="2025-12-07 19:21:48.081157682 +0000 UTC m=+412.660147727" observedRunningTime="2025-12-07 19:21:48.753300191 +0000 UTC m=+413.332290246" watchObservedRunningTime="2025-12-07 19:21:49.848316101 +0000 UTC m=+414.427306146" Dec 07 19:21:49 crc kubenswrapper[4815]: I1207 19:21:49.907258 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.179615 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.180354 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.228671 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.364775 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.365864 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.406544 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.785835 4815 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4qt4c" Dec 07 19:21:52 crc kubenswrapper[4815]: I1207 19:21:52.786276 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rgtd5" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.620995 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.621470 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.658015 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.762598 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.762644 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.790946 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qz8lw" Dec 07 19:21:54 crc kubenswrapper[4815]: I1207 19:21:54.797049 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:55 crc kubenswrapper[4815]: I1207 19:21:55.801518 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9xq4b" Dec 07 19:21:56 crc kubenswrapper[4815]: I1207 19:21:56.359443 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:21:56 crc kubenswrapper[4815]: I1207 19:21:56.359752 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:21:56 crc kubenswrapper[4815]: I1207 19:21:56.359887 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:21:56 crc kubenswrapper[4815]: I1207 19:21:56.360553 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:21:56 crc kubenswrapper[4815]: I1207 19:21:56.360679 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc" gracePeriod=600 Dec 07 19:21:58 crc kubenswrapper[4815]: I1207 19:21:58.777827 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc" exitCode=0 Dec 07 19:21:58 crc kubenswrapper[4815]: I1207 19:21:58.777907 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc"} Dec 07 19:21:58 crc kubenswrapper[4815]: I1207 19:21:58.778374 4815 scope.go:117] "RemoveContainer" containerID="55eba62082762b6527923446c2d419d2eb00919331cf8e59bfa9cb27ca65a82e" Dec 07 19:21:59 crc kubenswrapper[4815]: I1207 19:21:59.784034 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6"} Dec 07 19:22:14 crc kubenswrapper[4815]: I1207 19:22:14.961598 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" podUID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" containerName="registry" containerID="cri-o://7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21" gracePeriod=30 Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.476468 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562194 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562271 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562314 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562369 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562657 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562759 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562808 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.562955 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99rh\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh\") pod \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\" (UID: \"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac\") " Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.564618 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.568237 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.570218 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.570649 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh" (OuterVolumeSpecName: "kube-api-access-l99rh") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "kube-api-access-l99rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.571333 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.572479 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.580228 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.602532 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" (UID: "4c3b59c8-1311-4f2a-8ec6-699405d3a4ac"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664673 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l99rh\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-kube-api-access-l99rh\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664711 4815 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664725 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664734 4815 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664743 4815 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664752 4815 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.664759 4815 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.899907 4815 generic.go:334] "Generic (PLEG): container finished" podID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" containerID="7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21" exitCode=0 Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.899978 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" event={"ID":"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac","Type":"ContainerDied","Data":"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21"} Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.900095 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" event={"ID":"4c3b59c8-1311-4f2a-8ec6-699405d3a4ac","Type":"ContainerDied","Data":"61994bdb28b11b350eeebba08915901e777c8cbb60334b6a0fef728a01742e97"} Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.900139 4815 scope.go:117] "RemoveContainer" 
containerID="7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.900164 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k99gl" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.930611 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.934593 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k99gl"] Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.941908 4815 scope.go:117] "RemoveContainer" containerID="7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21" Dec 07 19:22:15 crc kubenswrapper[4815]: E1207 19:22:15.942647 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21\": container with ID starting with 7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21 not found: ID does not exist" containerID="7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21" Dec 07 19:22:15 crc kubenswrapper[4815]: I1207 19:22:15.942735 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21"} err="failed to get container status \"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21\": rpc error: code = NotFound desc = could not find container \"7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21\": container with ID starting with 7b32d1fc45f0aa64cb349a9c3afbfadae18e339e188c18f54e630b43efa9ab21 not found: ID does not exist" Dec 07 19:22:17 crc kubenswrapper[4815]: I1207 19:22:17.778422 4815 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" path="/var/lib/kubelet/pods/4c3b59c8-1311-4f2a-8ec6-699405d3a4ac/volumes" Dec 07 19:23:31 crc kubenswrapper[4815]: E1207 19:23:31.695398 4815 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Dec 07 19:24:26 crc kubenswrapper[4815]: I1207 19:24:26.360141 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:24:26 crc kubenswrapper[4815]: I1207 19:24:26.360730 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:24:56 crc kubenswrapper[4815]: I1207 19:24:56.360018 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:24:56 crc kubenswrapper[4815]: I1207 19:24:56.360793 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:25:26 crc kubenswrapper[4815]: I1207 19:25:26.360017 4815 
patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:25:26 crc kubenswrapper[4815]: I1207 19:25:26.360726 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:25:26 crc kubenswrapper[4815]: I1207 19:25:26.360805 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:25:26 crc kubenswrapper[4815]: I1207 19:25:26.361732 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:25:26 crc kubenswrapper[4815]: I1207 19:25:26.361848 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6" gracePeriod=600 Dec 07 19:25:27 crc kubenswrapper[4815]: I1207 19:25:27.148204 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6" exitCode=0 Dec 07 19:25:27 crc 
kubenswrapper[4815]: I1207 19:25:27.148294 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6"} Dec 07 19:25:27 crc kubenswrapper[4815]: I1207 19:25:27.149064 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212"} Dec 07 19:25:27 crc kubenswrapper[4815]: I1207 19:25:27.149135 4815 scope.go:117] "RemoveContainer" containerID="66d22a4a12607541a7479c5436a415fa94c0b6d69786fcddbf663ce01735e9bc" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.611956 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t2msm"] Dec 07 19:27:02 crc kubenswrapper[4815]: E1207 19:27:02.612658 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" containerName="registry" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.612669 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" containerName="registry" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.612758 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3b59c8-1311-4f2a-8ec6-699405d3a4ac" containerName="registry" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.613120 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.622757 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.624214 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.625771 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-52gfx"] Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.626421 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-52gfx" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.632468 4815 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-rghnk" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.642384 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t2msm"] Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.646393 4815 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-q6tpk" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.660472 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gghm8"] Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.661084 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.665176 4815 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7r2g9" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.669711 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-52gfx"] Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.681453 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9m9t\" (UniqueName: \"kubernetes.io/projected/c3a5ced1-3996-46a0-9762-bf0ff63654db-kube-api-access-z9m9t\") pod \"cert-manager-cainjector-7f985d654d-t2msm\" (UID: \"c3a5ced1-3996-46a0-9762-bf0ff63654db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.681501 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl87q\" (UniqueName: \"kubernetes.io/projected/2e33a89c-dfcf-456f-bf42-5f98dbe2cbca-kube-api-access-vl87q\") pod \"cert-manager-5b446d88c5-52gfx\" (UID: \"2e33a89c-dfcf-456f-bf42-5f98dbe2cbca\") " pod="cert-manager/cert-manager-5b446d88c5-52gfx" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.682438 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gghm8"] Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.782246 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cftrl\" (UniqueName: \"kubernetes.io/projected/9181caec-fc89-487a-ba9b-269b81d5a18a-kube-api-access-cftrl\") pod \"cert-manager-webhook-5655c58dd6-gghm8\" (UID: \"9181caec-fc89-487a-ba9b-269b81d5a18a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.782300 4815 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9m9t\" (UniqueName: \"kubernetes.io/projected/c3a5ced1-3996-46a0-9762-bf0ff63654db-kube-api-access-z9m9t\") pod \"cert-manager-cainjector-7f985d654d-t2msm\" (UID: \"c3a5ced1-3996-46a0-9762-bf0ff63654db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.782362 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl87q\" (UniqueName: \"kubernetes.io/projected/2e33a89c-dfcf-456f-bf42-5f98dbe2cbca-kube-api-access-vl87q\") pod \"cert-manager-5b446d88c5-52gfx\" (UID: \"2e33a89c-dfcf-456f-bf42-5f98dbe2cbca\") " pod="cert-manager/cert-manager-5b446d88c5-52gfx" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.802029 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9m9t\" (UniqueName: \"kubernetes.io/projected/c3a5ced1-3996-46a0-9762-bf0ff63654db-kube-api-access-z9m9t\") pod \"cert-manager-cainjector-7f985d654d-t2msm\" (UID: \"c3a5ced1-3996-46a0-9762-bf0ff63654db\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.803218 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl87q\" (UniqueName: \"kubernetes.io/projected/2e33a89c-dfcf-456f-bf42-5f98dbe2cbca-kube-api-access-vl87q\") pod \"cert-manager-5b446d88c5-52gfx\" (UID: \"2e33a89c-dfcf-456f-bf42-5f98dbe2cbca\") " pod="cert-manager/cert-manager-5b446d88c5-52gfx" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.883500 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cftrl\" (UniqueName: \"kubernetes.io/projected/9181caec-fc89-487a-ba9b-269b81d5a18a-kube-api-access-cftrl\") pod \"cert-manager-webhook-5655c58dd6-gghm8\" (UID: \"9181caec-fc89-487a-ba9b-269b81d5a18a\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.899396 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cftrl\" (UniqueName: \"kubernetes.io/projected/9181caec-fc89-487a-ba9b-269b81d5a18a-kube-api-access-cftrl\") pod \"cert-manager-webhook-5655c58dd6-gghm8\" (UID: \"9181caec-fc89-487a-ba9b-269b81d5a18a\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.927281 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.937375 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-52gfx" Dec 07 19:27:02 crc kubenswrapper[4815]: I1207 19:27:02.979969 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.161529 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t2msm"] Dec 07 19:27:03 crc kubenswrapper[4815]: W1207 19:27:03.176563 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3a5ced1_3996_46a0_9762_bf0ff63654db.slice/crio-4e83469940178b4a6c8cff7beea9ad9cefd22ff7e1b55163f1e9d86407909200 WatchSource:0}: Error finding container 4e83469940178b4a6c8cff7beea9ad9cefd22ff7e1b55163f1e9d86407909200: Status 404 returned error can't find the container with id 4e83469940178b4a6c8cff7beea9ad9cefd22ff7e1b55163f1e9d86407909200 Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.187865 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 
19:27:03.212485 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-52gfx"] Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.257985 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-gghm8"] Dec 07 19:27:03 crc kubenswrapper[4815]: W1207 19:27:03.263863 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9181caec_fc89_487a_ba9b_269b81d5a18a.slice/crio-c9962f933d51f2afbc06a4b7d653dad1bd1be0a2e290ba93b4e3610b5b44ca05 WatchSource:0}: Error finding container c9962f933d51f2afbc06a4b7d653dad1bd1be0a2e290ba93b4e3610b5b44ca05: Status 404 returned error can't find the container with id c9962f933d51f2afbc06a4b7d653dad1bd1be0a2e290ba93b4e3610b5b44ca05 Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.797064 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-52gfx" event={"ID":"2e33a89c-dfcf-456f-bf42-5f98dbe2cbca","Type":"ContainerStarted","Data":"88d146646218ac6306e66b80918512ab1cc94e1be9d1c67758e0f371f13b50d5"} Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.798373 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" event={"ID":"9181caec-fc89-487a-ba9b-269b81d5a18a","Type":"ContainerStarted","Data":"c9962f933d51f2afbc06a4b7d653dad1bd1be0a2e290ba93b4e3610b5b44ca05"} Dec 07 19:27:03 crc kubenswrapper[4815]: I1207 19:27:03.799464 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" event={"ID":"c3a5ced1-3996-46a0-9762-bf0ff63654db","Type":"ContainerStarted","Data":"4e83469940178b4a6c8cff7beea9ad9cefd22ff7e1b55163f1e9d86407909200"} Dec 07 19:27:05 crc kubenswrapper[4815]: I1207 19:27:05.818160 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" 
event={"ID":"c3a5ced1-3996-46a0-9762-bf0ff63654db","Type":"ContainerStarted","Data":"587f34001c91414aed4175312eb1fab0b0f7dc1e547955560e59f832d0637759"} Dec 07 19:27:05 crc kubenswrapper[4815]: I1207 19:27:05.820346 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-52gfx" event={"ID":"2e33a89c-dfcf-456f-bf42-5f98dbe2cbca","Type":"ContainerStarted","Data":"6e794d2bf304150f012d96fca175dbbbfd532a04a48e0886b467445dee9375c3"} Dec 07 19:27:05 crc kubenswrapper[4815]: I1207 19:27:05.844431 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-t2msm" podStartSLOduration=1.375580965 podStartE2EDuration="3.844414947s" podCreationTimestamp="2025-12-07 19:27:02 +0000 UTC" firstStartedPulling="2025-12-07 19:27:03.187662593 +0000 UTC m=+727.766652638" lastFinishedPulling="2025-12-07 19:27:05.656496565 +0000 UTC m=+730.235486620" observedRunningTime="2025-12-07 19:27:05.841893616 +0000 UTC m=+730.420883661" watchObservedRunningTime="2025-12-07 19:27:05.844414947 +0000 UTC m=+730.423404992" Dec 07 19:27:05 crc kubenswrapper[4815]: I1207 19:27:05.857631 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-52gfx" podStartSLOduration=1.440052709 podStartE2EDuration="3.857614133s" podCreationTimestamp="2025-12-07 19:27:02 +0000 UTC" firstStartedPulling="2025-12-07 19:27:03.225741536 +0000 UTC m=+727.804731581" lastFinishedPulling="2025-12-07 19:27:05.64330296 +0000 UTC m=+730.222293005" observedRunningTime="2025-12-07 19:27:05.856696487 +0000 UTC m=+730.435686532" watchObservedRunningTime="2025-12-07 19:27:05.857614133 +0000 UTC m=+730.436604168" Dec 07 19:27:07 crc kubenswrapper[4815]: I1207 19:27:07.831660 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" 
event={"ID":"9181caec-fc89-487a-ba9b-269b81d5a18a","Type":"ContainerStarted","Data":"db159f204a2957eec29f61729c7f888f6d6374936cd8e1352afb4bffdc4f3a07"} Dec 07 19:27:07 crc kubenswrapper[4815]: I1207 19:27:07.832678 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:07 crc kubenswrapper[4815]: I1207 19:27:07.854480 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" podStartSLOduration=2.384773878 podStartE2EDuration="5.854454364s" podCreationTimestamp="2025-12-07 19:27:02 +0000 UTC" firstStartedPulling="2025-12-07 19:27:03.265384743 +0000 UTC m=+727.844374788" lastFinishedPulling="2025-12-07 19:27:06.735065189 +0000 UTC m=+731.314055274" observedRunningTime="2025-12-07 19:27:07.846149668 +0000 UTC m=+732.425139713" watchObservedRunningTime="2025-12-07 19:27:07.854454364 +0000 UTC m=+732.433444439" Dec 07 19:27:12 crc kubenswrapper[4815]: I1207 19:27:12.984226 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-gghm8" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.147823 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zzw6c"] Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148258 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-controller" containerID="cri-o://60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148311 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="nbdb" 
containerID="cri-o://c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148402 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-node" containerID="cri-o://ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148476 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="sbdb" containerID="cri-o://ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148521 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="northd" containerID="cri-o://812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148439 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.148835 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-acl-logging" containerID="cri-o://6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 
19:27:13.186503 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" containerID="cri-o://d149289581f96fc5a122787e8f8595783c8924d8fa538104d01ab788729c41fd" gracePeriod=30 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.874254 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/2.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.875328 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/1.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.875418 4815 generic.go:334] "Generic (PLEG): container finished" podID="0b739f36-d9c4-4fb6-9ead-9df05e283dea" containerID="17214bf8d7b3f024980012a53d6b512815ab09eca779cdfe9e2a75a966a21663" exitCode=2 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.875562 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerDied","Data":"17214bf8d7b3f024980012a53d6b512815ab09eca779cdfe9e2a75a966a21663"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.875631 4815 scope.go:117] "RemoveContainer" containerID="7285635ab7710e9071a051b6e49036856b4c60c87b5110debec3bfb20bb0ac97" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.876205 4815 scope.go:117] "RemoveContainer" containerID="17214bf8d7b3f024980012a53d6b512815ab09eca779cdfe9e2a75a966a21663" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.880707 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovnkube-controller/3.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.889008 4815 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-acl-logging/0.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890129 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-controller/0.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890777 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="d149289581f96fc5a122787e8f8595783c8924d8fa538104d01ab788729c41fd" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890813 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890830 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890847 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890864 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890880 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d" exitCode=0 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890896 4815 generic.go:334] "Generic 
(PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e" exitCode=143 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890945 4815 generic.go:334] "Generic (PLEG): container finished" podID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerID="60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756" exitCode=143 Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.890985 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"d149289581f96fc5a122787e8f8595783c8924d8fa538104d01ab788729c41fd"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891033 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891059 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891081 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891103 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" 
event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891125 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891147 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.891173 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756"} Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.942300 4815 scope.go:117] "RemoveContainer" containerID="ebff9b10eeedac44403b96b81a1731f6d5569cf3097b0387d72e11c1d602d51b" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.955196 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-acl-logging/0.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.956159 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-controller/0.log" Dec 07 19:27:13 crc kubenswrapper[4815]: I1207 19:27:13.956775 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015569 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pld78"] Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015801 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kubecfg-setup" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015815 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kubecfg-setup" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015829 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015837 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015846 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015854 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015864 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-acl-logging" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015871 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-acl-logging" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015880 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015886 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015893 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015901 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015934 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-node" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015942 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-node" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015950 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="sbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015958 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="sbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015968 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-ovn-metrics" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015975 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-ovn-metrics" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.015989 4815 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.015995 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.016005 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="northd" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016011 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="northd" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.016020 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="nbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016027 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="nbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016135 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="sbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016144 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016154 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016162 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-node" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016171 4815 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="kube-rbac-proxy-ovn-metrics" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016179 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016189 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="northd" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016201 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="nbdb" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016210 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovn-acl-logging" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016222 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: E1207 19:27:14.016340 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016349 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016457 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.016467 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" containerName="ovnkube-controller" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.018225 4815 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.027952 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028008 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028046 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028091 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028123 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028132 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028162 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028165 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028198 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028211 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028232 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028238 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028268 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028302 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028345 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028387 4815 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028443 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028481 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028513 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddph\" (UniqueName: \"kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028569 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028609 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028640 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028649 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028669 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028694 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.028706 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib\") pod \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\" (UID: \"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f\") " Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029069 4815 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029093 4815 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029110 4815 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029126 4815 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029125 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log" (OuterVolumeSpecName: "node-log") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029140 4815 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029199 4815 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029236 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029272 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029649 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029715 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029751 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket" (OuterVolumeSpecName: "log-socket") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029784 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029826 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.029996 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash" (OuterVolumeSpecName: "host-slash") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.030006 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.030481 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.039857 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.039991 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph" (OuterVolumeSpecName: "kube-api-access-tddph") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "kube-api-access-tddph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.044503 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" (UID: "13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130572 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-systemd-units\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130645 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-netns\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130675 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lv78\" (UniqueName: 
\"kubernetes.io/projected/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-kube-api-access-6lv78\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130816 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovn-node-metrics-cert\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130935 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-systemd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.130977 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-node-log\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131007 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131050 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-env-overrides\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131085 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-var-lib-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131135 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-script-lib\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131167 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131200 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-log-socket\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131233 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-etc-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131281 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-netd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131321 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-bin\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131373 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-slash\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131428 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-config\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131536 
4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-ovn-kubernetes\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131593 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-ovn\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131626 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-kubelet\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131711 4815 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131732 4815 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131750 4815 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc 
kubenswrapper[4815]: I1207 19:27:14.131768 4815 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131785 4815 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131802 4815 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131819 4815 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-slash\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131836 4815 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131853 4815 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131870 4815 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131888 4815 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tddph\" (UniqueName: \"kubernetes.io/projected/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-kube-api-access-tddph\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131906 4815 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-node-log\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131949 4815 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.131966 4815 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f-log-socket\") on node \"crc\" DevicePath \"\"" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233178 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-config\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233276 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-ovn\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233312 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-ovn-kubernetes\") pod \"ovnkube-node-pld78\" 
(UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233340 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-kubelet\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233379 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-systemd-units\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233413 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-netns\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233454 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lv78\" (UniqueName: \"kubernetes.io/projected/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-kube-api-access-6lv78\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233483 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovn-node-metrics-cert\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233525 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-systemd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233554 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-node-log\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233584 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233617 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-env-overrides\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233651 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-var-lib-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233692 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-script-lib\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233721 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233749 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-log-socket\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233776 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-etc-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233812 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-netd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc 
kubenswrapper[4815]: I1207 19:27:14.233841 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-bin\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233870 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-slash\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.233997 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-slash\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235064 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-config\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235142 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-ovn\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235186 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-ovn-kubernetes\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235229 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-kubelet\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235267 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-systemd-units\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.235306 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-run-netns\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.236133 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-etc-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.236327 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.236442 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-log-socket\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.236591 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-netd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.236725 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.237222 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-run-systemd\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.237341 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-node-log\") pod 
\"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.237463 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-host-cni-bin\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.238229 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-env-overrides\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.238382 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-var-lib-openvswitch\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.238525 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovnkube-script-lib\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.242519 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-ovn-node-metrics-cert\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.273633 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lv78\" (UniqueName: \"kubernetes.io/projected/f3f9cdb4-b097-40ec-a8f0-a37c4374483e-kube-api-access-6lv78\") pod \"ovnkube-node-pld78\" (UID: \"f3f9cdb4-b097-40ec-a8f0-a37c4374483e\") " pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.344568 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.899468 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-acl-logging/0.log" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.900023 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zzw6c_13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/ovn-controller/0.log" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.900595 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" event={"ID":"13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f","Type":"ContainerDied","Data":"7b2f72cd2cbe2cb11f715a46c910cb0ce86591696554d1aaa22ae4b2b3c2e0b2"} Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.900644 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zzw6c" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.900659 4815 scope.go:117] "RemoveContainer" containerID="d149289581f96fc5a122787e8f8595783c8924d8fa538104d01ab788729c41fd" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.908688 4815 generic.go:334] "Generic (PLEG): container finished" podID="f3f9cdb4-b097-40ec-a8f0-a37c4374483e" containerID="5655e3dabd5133bf839271b8c1d137a9cc3cd0819006e1669625de04290fcf3a" exitCode=0 Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.908795 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerDied","Data":"5655e3dabd5133bf839271b8c1d137a9cc3cd0819006e1669625de04290fcf3a"} Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.908829 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"6eb4530a5e8d90bb66672ed163e4baab8c747b1363141a61c2906c869f683e1d"} Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.916149 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s95hp_0b739f36-d9c4-4fb6-9ead-9df05e283dea/kube-multus/2.log" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.916210 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s95hp" event={"ID":"0b739f36-d9c4-4fb6-9ead-9df05e283dea","Type":"ContainerStarted","Data":"6474d818f7506d40006de26ec23ad982be499ed53eb266ff2c8c40bc6aa0492c"} Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.923294 4815 scope.go:117] "RemoveContainer" containerID="ba72f8ec4b6a36d0c1a6fd8f9f8e5fad577a00b8ae515738a007b994b989d134" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.950131 4815 scope.go:117] "RemoveContainer" 
containerID="c63ad6fab49ca4b8deb896aef3b5ff43c3a1ae1035e7ad94815a938939c57c58" Dec 07 19:27:14 crc kubenswrapper[4815]: I1207 19:27:14.976709 4815 scope.go:117] "RemoveContainer" containerID="812356ce6808d8d2a5845dec1132733fab86a0d6853951920ce5fb5083551c7e" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.002237 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zzw6c"] Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.008974 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zzw6c"] Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.009080 4815 scope.go:117] "RemoveContainer" containerID="d4050d0d7398492efc8809311a01c1a60986b97b6d177846e70c9283399f37b3" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.030086 4815 scope.go:117] "RemoveContainer" containerID="ac471afc20373bd22caa4941b5ebe38fdf62ad2145fabb4211fa02ce8728ab1d" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.041775 4815 scope.go:117] "RemoveContainer" containerID="6ab1c01dcbfe50580e64d89dc9e14843419588c89d7d792af89034d8f272e44e" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.052536 4815 scope.go:117] "RemoveContainer" containerID="60f140818c38211cb31d63c0b540b60681a781c349ba0523866a3c7e80f33756" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.064024 4815 scope.go:117] "RemoveContainer" containerID="1a4a4b2d4e80c078ae857e91491c082c6b1c57a5297c485c31fac0c90996b19f" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.777022 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f" path="/var/lib/kubelet/pods/13ca3d87-e6c7-4d51-9b77-34fdd2a38e1f/volumes" Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924120 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" 
event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"3b990fbf71567708aa7bd477308789093665c3f871c075820de0842bc44d8195"} Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924161 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"43a3fa75265b07d6621ffba2337ce5d481230fae1b90386aa7ff2d913c350c0e"} Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924175 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"3baafe2a55f0ef7d68b74f270445c0164a89f6fd56467840e64ae82c8a7299c7"} Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924187 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"6829a41036db8ad0edab08813014c2e0ae2e10e9dfce54c5f4a5bd679665f898"} Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924199 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"90d4d6cd672289fde67d0b0f223a2958ab2e2bb43f0300b1fc95d134e7b9fc12"} Dec 07 19:27:15 crc kubenswrapper[4815]: I1207 19:27:15.924211 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"f3ff84b0a470b6eadeb1bb365769838482424b7855788dbf48b4ad23e34d9af9"} Dec 07 19:27:17 crc kubenswrapper[4815]: I1207 19:27:17.939086 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" 
event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"1b8dbae6147ecd03ea453c1267170cdc86128d1e134f460d6347b833b380cc05"} Dec 07 19:27:20 crc kubenswrapper[4815]: I1207 19:27:20.963375 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" event={"ID":"f3f9cdb4-b097-40ec-a8f0-a37c4374483e","Type":"ContainerStarted","Data":"6d2ff56fb1c7c7098938d2d42add54d50330d6663f9b5473ad58ca7bc84ca04a"} Dec 07 19:27:20 crc kubenswrapper[4815]: I1207 19:27:20.966587 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:20 crc kubenswrapper[4815]: I1207 19:27:20.966716 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:20 crc kubenswrapper[4815]: I1207 19:27:20.966740 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:21 crc kubenswrapper[4815]: I1207 19:27:21.053988 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" podStartSLOduration=8.053963666 podStartE2EDuration="8.053963666s" podCreationTimestamp="2025-12-07 19:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:27:21.050385984 +0000 UTC m=+745.629376029" watchObservedRunningTime="2025-12-07 19:27:21.053963666 +0000 UTC m=+745.632953741" Dec 07 19:27:21 crc kubenswrapper[4815]: I1207 19:27:21.054784 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:21 crc kubenswrapper[4815]: I1207 19:27:21.058288 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:26 crc 
kubenswrapper[4815]: I1207 19:27:26.360185 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:27:26 crc kubenswrapper[4815]: I1207 19:27:26.360646 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:27:44 crc kubenswrapper[4815]: I1207 19:27:44.388289 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pld78" Dec 07 19:27:46 crc kubenswrapper[4815]: I1207 19:27:46.822076 4815 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.738177 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c"] Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.740793 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.743495 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.756179 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c"] Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.831402 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.831468 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5drq\" (UniqueName: \"kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.831495 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: 
I1207 19:27:54.933315 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.933395 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5drq\" (UniqueName: \"kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.933437 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.933835 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.934107 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:54 crc kubenswrapper[4815]: I1207 19:27:54.956203 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5drq\" (UniqueName: \"kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:55 crc kubenswrapper[4815]: I1207 19:27:55.057465 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:27:55 crc kubenswrapper[4815]: I1207 19:27:55.298565 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c"] Dec 07 19:27:55 crc kubenswrapper[4815]: W1207 19:27:55.309785 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fa0357a_e808_457c_b9dc_728eca137551.slice/crio-1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d WatchSource:0}: Error finding container 1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d: Status 404 returned error can't find the container with id 1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.177443 4815 generic.go:334] "Generic (PLEG): container finished" podID="6fa0357a-e808-457c-b9dc-728eca137551" containerID="b297ab65ccb42ae604846c5373765ba2046ab830bb9659c1f0813e77be6d4c96" 
exitCode=0 Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.177570 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" event={"ID":"6fa0357a-e808-457c-b9dc-728eca137551","Type":"ContainerDied","Data":"b297ab65ccb42ae604846c5373765ba2046ab830bb9659c1f0813e77be6d4c96"} Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.178085 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" event={"ID":"6fa0357a-e808-457c-b9dc-728eca137551","Type":"ContainerStarted","Data":"1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d"} Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.360355 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.360425 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.930411 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.931580 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:56 crc kubenswrapper[4815]: I1207 19:27:56.947132 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.060123 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.060174 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhbn\" (UniqueName: \"kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.060253 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.161516 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khhbn\" (UniqueName: \"kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.161616 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.161659 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.162215 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.162304 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.183308 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhbn\" (UniqueName: \"kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn\") pod \"redhat-operators-sn7cx\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.244843 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:27:57 crc kubenswrapper[4815]: I1207 19:27:57.453706 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:27:58 crc kubenswrapper[4815]: I1207 19:27:58.186944 4815 generic.go:334] "Generic (PLEG): container finished" podID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerID="5dcfa2cce897d4fbae1844b2acf9ceed436ca5fa9a7fca2e56313a53fe9adc9f" exitCode=0 Dec 07 19:27:58 crc kubenswrapper[4815]: I1207 19:27:58.187288 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerDied","Data":"5dcfa2cce897d4fbae1844b2acf9ceed436ca5fa9a7fca2e56313a53fe9adc9f"} Dec 07 19:27:58 crc kubenswrapper[4815]: I1207 19:27:58.187313 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerStarted","Data":"cc6d12253c66efcd96c35d5995743c60a490009302a1540b89985d826e5ed5ca"} Dec 07 19:27:58 crc kubenswrapper[4815]: I1207 19:27:58.193411 4815 generic.go:334] "Generic (PLEG): container finished" podID="6fa0357a-e808-457c-b9dc-728eca137551" containerID="2c56ff246a62c5285e678b8899a520f1603e2eccfdc5206ed77d30bef41f6fd5" exitCode=0 Dec 07 19:27:58 crc kubenswrapper[4815]: I1207 19:27:58.193442 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" event={"ID":"6fa0357a-e808-457c-b9dc-728eca137551","Type":"ContainerDied","Data":"2c56ff246a62c5285e678b8899a520f1603e2eccfdc5206ed77d30bef41f6fd5"} Dec 07 19:27:59 crc kubenswrapper[4815]: I1207 19:27:59.386282 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" 
event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerStarted","Data":"89bdf2690a49d5bd2627167a05e3ba8662973a530da0f426f3d19df480d9cec2"} Dec 07 19:27:59 crc kubenswrapper[4815]: I1207 19:27:59.389335 4815 generic.go:334] "Generic (PLEG): container finished" podID="6fa0357a-e808-457c-b9dc-728eca137551" containerID="41442e8c39179dec2ae15c76a963cf9078f33f341426a5d10389606fa5656d73" exitCode=0 Dec 07 19:27:59 crc kubenswrapper[4815]: I1207 19:27:59.389380 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" event={"ID":"6fa0357a-e808-457c-b9dc-728eca137551","Type":"ContainerDied","Data":"41442e8c39179dec2ae15c76a963cf9078f33f341426a5d10389606fa5656d73"} Dec 07 19:28:00 crc kubenswrapper[4815]: I1207 19:28:00.976534 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.013456 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util\") pod \"6fa0357a-e808-457c-b9dc-728eca137551\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.013503 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle\") pod \"6fa0357a-e808-457c-b9dc-728eca137551\" (UID: \"6fa0357a-e808-457c-b9dc-728eca137551\") " Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.013527 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5drq\" (UniqueName: \"kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq\") pod \"6fa0357a-e808-457c-b9dc-728eca137551\" (UID: 
\"6fa0357a-e808-457c-b9dc-728eca137551\") " Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.014541 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle" (OuterVolumeSpecName: "bundle") pod "6fa0357a-e808-457c-b9dc-728eca137551" (UID: "6fa0357a-e808-457c-b9dc-728eca137551"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.019307 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq" (OuterVolumeSpecName: "kube-api-access-g5drq") pod "6fa0357a-e808-457c-b9dc-728eca137551" (UID: "6fa0357a-e808-457c-b9dc-728eca137551"). InnerVolumeSpecName "kube-api-access-g5drq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.114708 4815 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.114737 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5drq\" (UniqueName: \"kubernetes.io/projected/6fa0357a-e808-457c-b9dc-728eca137551-kube-api-access-g5drq\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.240335 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util" (OuterVolumeSpecName: "util") pod "6fa0357a-e808-457c-b9dc-728eca137551" (UID: "6fa0357a-e808-457c-b9dc-728eca137551"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.318134 4815 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fa0357a-e808-457c-b9dc-728eca137551-util\") on node \"crc\" DevicePath \"\""
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.404726 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c" event={"ID":"6fa0357a-e808-457c-b9dc-728eca137551","Type":"ContainerDied","Data":"1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d"}
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.404785 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e735508175f878d1f29ffd394c3b26f4b043e85deb2c65680b5db062809e99d"
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.404743 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c"
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.407073 4815 generic.go:334] "Generic (PLEG): container finished" podID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerID="89bdf2690a49d5bd2627167a05e3ba8662973a530da0f426f3d19df480d9cec2" exitCode=0
Dec 07 19:28:01 crc kubenswrapper[4815]: I1207 19:28:01.407129 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerDied","Data":"89bdf2690a49d5bd2627167a05e3ba8662973a530da0f426f3d19df480d9cec2"}
Dec 07 19:28:02 crc kubenswrapper[4815]: I1207 19:28:02.418820 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerStarted","Data":"e199e15d633457ea36dd6499020f71d17f12242fd77e3f3830acfa56b0aade72"}
Dec 07 19:28:02 crc kubenswrapper[4815]: I1207 19:28:02.440279 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sn7cx" podStartSLOduration=2.738370519 podStartE2EDuration="6.440260377s" podCreationTimestamp="2025-12-07 19:27:56 +0000 UTC" firstStartedPulling="2025-12-07 19:27:58.188501712 +0000 UTC m=+782.767491757" lastFinishedPulling="2025-12-07 19:28:01.89039156 +0000 UTC m=+786.469381615" observedRunningTime="2025-12-07 19:28:02.436762107 +0000 UTC m=+787.015752182" watchObservedRunningTime="2025-12-07 19:28:02.440260377 +0000 UTC m=+787.019250432"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.870107 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"]
Dec 07 19:28:04 crc kubenswrapper[4815]: E1207 19:28:04.870608 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="extract"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.870621 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="extract"
Dec 07 19:28:04 crc kubenswrapper[4815]: E1207 19:28:04.870631 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="util"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.870638 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="util"
Dec 07 19:28:04 crc kubenswrapper[4815]: E1207 19:28:04.870655 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="pull"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.870661 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="pull"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.870759 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa0357a-e808-457c-b9dc-728eca137551" containerName="extract"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.871133 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.876086 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-t98t5"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.876138 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.877039 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.919581 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"]
Dec 07 19:28:04 crc kubenswrapper[4815]: I1207 19:28:04.968358 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtxn\" (UniqueName: \"kubernetes.io/projected/49c1d011-c4fa-4a22-b8e6-102fcc05362f-kube-api-access-bvtxn\") pod \"nmstate-operator-5b5b58f5c8-zg96h\" (UID: \"49c1d011-c4fa-4a22-b8e6-102fcc05362f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"
Dec 07 19:28:05 crc kubenswrapper[4815]: I1207 19:28:05.069791 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtxn\" (UniqueName: \"kubernetes.io/projected/49c1d011-c4fa-4a22-b8e6-102fcc05362f-kube-api-access-bvtxn\") pod \"nmstate-operator-5b5b58f5c8-zg96h\" (UID: \"49c1d011-c4fa-4a22-b8e6-102fcc05362f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"
Dec 07 19:28:05 crc kubenswrapper[4815]: I1207 19:28:05.093825 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtxn\" (UniqueName: \"kubernetes.io/projected/49c1d011-c4fa-4a22-b8e6-102fcc05362f-kube-api-access-bvtxn\") pod \"nmstate-operator-5b5b58f5c8-zg96h\" (UID: \"49c1d011-c4fa-4a22-b8e6-102fcc05362f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"
Dec 07 19:28:05 crc kubenswrapper[4815]: I1207 19:28:05.185250 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"
Dec 07 19:28:05 crc kubenswrapper[4815]: I1207 19:28:05.416165 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h"]
Dec 07 19:28:05 crc kubenswrapper[4815]: W1207 19:28:05.419160 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49c1d011_c4fa_4a22_b8e6_102fcc05362f.slice/crio-966d0c958a3967b5ae9ee71f25158be02d7e5e590720ad35ead914ca41270332 WatchSource:0}: Error finding container 966d0c958a3967b5ae9ee71f25158be02d7e5e590720ad35ead914ca41270332: Status 404 returned error can't find the container with id 966d0c958a3967b5ae9ee71f25158be02d7e5e590720ad35ead914ca41270332
Dec 07 19:28:05 crc kubenswrapper[4815]: I1207 19:28:05.438365 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h" event={"ID":"49c1d011-c4fa-4a22-b8e6-102fcc05362f","Type":"ContainerStarted","Data":"966d0c958a3967b5ae9ee71f25158be02d7e5e590720ad35ead914ca41270332"}
Dec 07 19:28:07 crc kubenswrapper[4815]: I1207 19:28:07.245702 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sn7cx"
Dec 07 19:28:07 crc kubenswrapper[4815]: I1207 19:28:07.246108 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sn7cx"
Dec 07 19:28:08 crc kubenswrapper[4815]: I1207 19:28:08.295503 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sn7cx" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="registry-server" probeResult="failure" output=<
Dec 07 19:28:08 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s
Dec 07 19:28:08 crc kubenswrapper[4815]: >
Dec 07 19:28:09 crc kubenswrapper[4815]: I1207 19:28:09.842728 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h" event={"ID":"49c1d011-c4fa-4a22-b8e6-102fcc05362f","Type":"ContainerStarted","Data":"15992e63e250455ed5d5483fba88e8c73b1c8813be80f0be34850e129ac85f18"}
Dec 07 19:28:09 crc kubenswrapper[4815]: I1207 19:28:09.870467 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-zg96h" podStartSLOduration=2.086791074 podStartE2EDuration="5.870441865s" podCreationTimestamp="2025-12-07 19:28:04 +0000 UTC" firstStartedPulling="2025-12-07 19:28:05.420964269 +0000 UTC m=+789.999954314" lastFinishedPulling="2025-12-07 19:28:09.20461507 +0000 UTC m=+793.783605105" observedRunningTime="2025-12-07 19:28:09.859900444 +0000 UTC m=+794.438890529" watchObservedRunningTime="2025-12-07 19:28:09.870441865 +0000 UTC m=+794.449431950"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.040276 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.041888 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.051675 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2j567"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.058746 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.059758 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.063759 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.066057 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.085949 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fnq5m"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.086654 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.112246 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.189457 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.189825 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbb7\" (UniqueName: \"kubernetes.io/projected/de2e93af-0942-4efc-a33f-c9685c554154-kube-api-access-rtbb7\") pod \"nmstate-metrics-7f946cbc9-4z7hm\" (UID: \"de2e93af-0942-4efc-a33f-c9685c554154\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.189901 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-nmstate-lock\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.189953 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfl6d\" (UniqueName: \"kubernetes.io/projected/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-kube-api-access-bfl6d\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.190019 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-ovs-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.190044 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6cfj\" (UniqueName: \"kubernetes.io/projected/a75fb831-cdc3-4a5b-b4ae-451acdaef347-kube-api-access-l6cfj\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.190068 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-dbus-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.221735 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.222409 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.227727 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.228056 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-ksg8x"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.228970 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.241103 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291576 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfl6d\" (UniqueName: \"kubernetes.io/projected/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-kube-api-access-bfl6d\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291659 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-ovs-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291820 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6cfj\" (UniqueName: \"kubernetes.io/projected/a75fb831-cdc3-4a5b-b4ae-451acdaef347-kube-api-access-l6cfj\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291872 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-ovs-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291893 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-dbus-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.291989 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.292038 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbb7\" (UniqueName: \"kubernetes.io/projected/de2e93af-0942-4efc-a33f-c9685c554154-kube-api-access-rtbb7\") pod \"nmstate-metrics-7f946cbc9-4z7hm\" (UID: \"de2e93af-0942-4efc-a33f-c9685c554154\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"
Dec 07 19:28:15 crc kubenswrapper[4815]: E1207 19:28:15.292150 4815 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.292188 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-nmstate-lock\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: E1207 19:28:15.292218 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair podName:a75fb831-cdc3-4a5b-b4ae-451acdaef347 nodeName:}" failed. No retries permitted until 2025-12-07 19:28:15.792194387 +0000 UTC m=+800.371184442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-8khj4" (UID: "a75fb831-cdc3-4a5b-b4ae-451acdaef347") : secret "openshift-nmstate-webhook" not found
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.292270 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-nmstate-lock\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.292290 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-dbus-socket\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.311791 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6cfj\" (UniqueName: \"kubernetes.io/projected/a75fb831-cdc3-4a5b-b4ae-451acdaef347-kube-api-access-l6cfj\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.314727 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfl6d\" (UniqueName: \"kubernetes.io/projected/20b5ab6f-84d1-4d0f-a133-c800fbc797c1-kube-api-access-bfl6d\") pod \"nmstate-handler-fnq5m\" (UID: \"20b5ab6f-84d1-4d0f-a133-c800fbc797c1\") " pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.330678 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbb7\" (UniqueName: \"kubernetes.io/projected/de2e93af-0942-4efc-a33f-c9685c554154-kube-api-access-rtbb7\") pod \"nmstate-metrics-7f946cbc9-4z7hm\" (UID: \"de2e93af-0942-4efc-a33f-c9685c554154\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.386088 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.392800 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6174174-1523-4a61-84e4-3d2d8f12303f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.392868 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.392905 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7r68\" (UniqueName: \"kubernetes.io/projected/b6174174-1523-4a61-84e4-3d2d8f12303f-kube-api-access-w7r68\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.435149 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fnq5m"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.473796 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-98fb7895b-jgq24"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.479395 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.493367 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.493423 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7r68\" (UniqueName: \"kubernetes.io/projected/b6174174-1523-4a61-84e4-3d2d8f12303f-kube-api-access-w7r68\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.493474 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6174174-1523-4a61-84e4-3d2d8f12303f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.494447 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b6174174-1523-4a61-84e4-3d2d8f12303f-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: E1207 19:28:15.494514 4815 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Dec 07 19:28:15 crc kubenswrapper[4815]: E1207 19:28:15.494545 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert podName:b6174174-1523-4a61-84e4-3d2d8f12303f nodeName:}" failed. No retries permitted until 2025-12-07 19:28:15.99453479 +0000 UTC m=+800.573524835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-dr8jc" (UID: "b6174174-1523-4a61-84e4-3d2d8f12303f") : secret "plugin-serving-cert" not found
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.518328 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-98fb7895b-jgq24"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.531767 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7r68\" (UniqueName: \"kubernetes.io/projected/b6174174-1523-4a61-84e4-3d2d8f12303f-kube-api-access-w7r68\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594491 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594526 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5ddm\" (UniqueName: \"kubernetes.io/projected/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-kube-api-access-f5ddm\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594550 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594578 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-trusted-ca-bundle\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594601 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-oauth-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594621 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-service-ca\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.594934 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-oauth-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.691056 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm"]
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.695880 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-oauth-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.695929 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5ddm\" (UniqueName: \"kubernetes.io/projected/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-kube-api-access-f5ddm\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.695948 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.695965 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.695994 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-trusted-ca-bundle\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.696014 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-oauth-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.696035 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-service-ca\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.697413 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-oauth-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.697504 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-trusted-ca-bundle\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.697613 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.697619 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-service-ca\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.702215 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-oauth-config\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.702454 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-console-serving-cert\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.715882 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5ddm\" (UniqueName: \"kubernetes.io/projected/7f78cc6c-7d9f-4a01-83d6-2b432d55e172-kube-api-access-f5ddm\") pod \"console-98fb7895b-jgq24\" (UID: \"7f78cc6c-7d9f-4a01-83d6-2b432d55e172\") " pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.799965 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.804873 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a75fb831-cdc3-4a5b-b4ae-451acdaef347-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8khj4\" (UID: \"a75fb831-cdc3-4a5b-b4ae-451acdaef347\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:15 crc kubenswrapper[4815]: I1207 19:28:15.807871 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-98fb7895b-jgq24"
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:15.999953 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fnq5m" event={"ID":"20b5ab6f-84d1-4d0f-a133-c800fbc797c1","Type":"ContainerStarted","Data":"6ceefb68a6bedfff7c6302f5e7fa5fdb5356abe5191e6fb0b760ed1c52a6e609"}
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.000199 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.003017 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.003844 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm" event={"ID":"de2e93af-0942-4efc-a33f-c9685c554154","Type":"ContainerStarted","Data":"4dcaa86f546259e45b6ab0e66ad681176a15bb7f2ef38e4b6428905f128f8cb4"}
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.006266 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6174174-1523-4a61-84e4-3d2d8f12303f-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dr8jc\" (UID: \"b6174174-1523-4a61-84e4-3d2d8f12303f\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.143472 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.240604 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-98fb7895b-jgq24"]
Dec 07 19:28:16 crc kubenswrapper[4815]: W1207 19:28:16.287004 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f78cc6c_7d9f_4a01_83d6_2b432d55e172.slice/crio-3bc52c6131c5936ffbf9762539da7218e87d296c8a8a176cd2981d84b511b7ef WatchSource:0}: Error finding container 3bc52c6131c5936ffbf9762539da7218e87d296c8a8a176cd2981d84b511b7ef: Status 404 returned error can't find the container with id 3bc52c6131c5936ffbf9762539da7218e87d296c8a8a176cd2981d84b511b7ef
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.365861 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4"]
Dec 07 19:28:16 crc kubenswrapper[4815]: W1207 19:28:16.377833 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda75fb831_cdc3_4a5b_b4ae_451acdaef347.slice/crio-d412c62658e666d2cc6eab898e8766834f00f60654b5d171f17361cb9b08d152 WatchSource:0}: Error finding container d412c62658e666d2cc6eab898e8766834f00f60654b5d171f17361cb9b08d152: Status 404 returned error can't find the container with id d412c62658e666d2cc6eab898e8766834f00f60654b5d171f17361cb9b08d152
Dec 07 19:28:16 crc kubenswrapper[4815]: I1207 19:28:16.638151 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc"]
Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.017808 4815 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4" event={"ID":"a75fb831-cdc3-4a5b-b4ae-451acdaef347","Type":"ContainerStarted","Data":"d412c62658e666d2cc6eab898e8766834f00f60654b5d171f17361cb9b08d152"} Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.019765 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-98fb7895b-jgq24" event={"ID":"7f78cc6c-7d9f-4a01-83d6-2b432d55e172","Type":"ContainerStarted","Data":"3bc52c6131c5936ffbf9762539da7218e87d296c8a8a176cd2981d84b511b7ef"} Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.024590 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc" event={"ID":"b6174174-1523-4a61-84e4-3d2d8f12303f","Type":"ContainerStarted","Data":"12bc1a38386a94c02ceee371708d91cac6e110f731a9e7d482343252b65e34e0"} Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.319749 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.400758 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:28:17 crc kubenswrapper[4815]: I1207 19:28:17.551963 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:28:18 crc kubenswrapper[4815]: I1207 19:28:18.030888 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-98fb7895b-jgq24" event={"ID":"7f78cc6c-7d9f-4a01-83d6-2b432d55e172","Type":"ContainerStarted","Data":"c138357db23a688e437a8040a5970f561ee7ca0403d9b75ff228a7fd7f0f9ee1"} Dec 07 19:28:18 crc kubenswrapper[4815]: I1207 19:28:18.053807 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-98fb7895b-jgq24" podStartSLOduration=3.053790399 
podStartE2EDuration="3.053790399s" podCreationTimestamp="2025-12-07 19:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:28:18.051694629 +0000 UTC m=+802.630684684" watchObservedRunningTime="2025-12-07 19:28:18.053790399 +0000 UTC m=+802.632780444" Dec 07 19:28:19 crc kubenswrapper[4815]: I1207 19:28:19.037146 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sn7cx" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="registry-server" containerID="cri-o://e199e15d633457ea36dd6499020f71d17f12242fd77e3f3830acfa56b0aade72" gracePeriod=2 Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.049871 4815 generic.go:334] "Generic (PLEG): container finished" podID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerID="e199e15d633457ea36dd6499020f71d17f12242fd77e3f3830acfa56b0aade72" exitCode=0 Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.050158 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerDied","Data":"e199e15d633457ea36dd6499020f71d17f12242fd77e3f3830acfa56b0aade72"} Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.464797 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.536983 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content\") pod \"e3107902-48ef-4548-a73a-a57a0f36d5d2\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.537021 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khhbn\" (UniqueName: \"kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn\") pod \"e3107902-48ef-4548-a73a-a57a0f36d5d2\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.539044 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities" (OuterVolumeSpecName: "utilities") pod "e3107902-48ef-4548-a73a-a57a0f36d5d2" (UID: "e3107902-48ef-4548-a73a-a57a0f36d5d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.541096 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities\") pod \"e3107902-48ef-4548-a73a-a57a0f36d5d2\" (UID: \"e3107902-48ef-4548-a73a-a57a0f36d5d2\") " Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.541511 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.545096 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn" (OuterVolumeSpecName: "kube-api-access-khhbn") pod "e3107902-48ef-4548-a73a-a57a0f36d5d2" (UID: "e3107902-48ef-4548-a73a-a57a0f36d5d2"). InnerVolumeSpecName "kube-api-access-khhbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.643031 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khhbn\" (UniqueName: \"kubernetes.io/projected/e3107902-48ef-4548-a73a-a57a0f36d5d2-kube-api-access-khhbn\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.652170 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3107902-48ef-4548-a73a-a57a0f36d5d2" (UID: "e3107902-48ef-4548-a73a-a57a0f36d5d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:28:20 crc kubenswrapper[4815]: I1207 19:28:20.745483 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3107902-48ef-4548-a73a-a57a0f36d5d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.073684 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc" event={"ID":"b6174174-1523-4a61-84e4-3d2d8f12303f","Type":"ContainerStarted","Data":"591974acf723c8a42828a40cb2894c8ea1c260ea958f985e400e66c984af115d"} Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.076416 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4" event={"ID":"a75fb831-cdc3-4a5b-b4ae-451acdaef347","Type":"ContainerStarted","Data":"56b1e6490340cfd035ba42c0e7246ea31047fae206fae2d9f3a7c1f6a7dafc2e"} Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.077521 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm" event={"ID":"de2e93af-0942-4efc-a33f-c9685c554154","Type":"ContainerStarted","Data":"bd8be23479c98d89f780b454dc5935a6c8932f2172be55357f6291c290822540"} Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.078488 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fnq5m" event={"ID":"20b5ab6f-84d1-4d0f-a133-c800fbc797c1","Type":"ContainerStarted","Data":"272a2b89f328c1f605caa1e39948f0f0e677ba0fb7ac41c8c4b43b8658a15fe8"} Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.078867 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fnq5m" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.082407 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn7cx" 
event={"ID":"e3107902-48ef-4548-a73a-a57a0f36d5d2","Type":"ContainerDied","Data":"cc6d12253c66efcd96c35d5995743c60a490009302a1540b89985d826e5ed5ca"} Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.082447 4815 scope.go:117] "RemoveContainer" containerID="e199e15d633457ea36dd6499020f71d17f12242fd77e3f3830acfa56b0aade72" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.082537 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn7cx" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.099632 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dr8jc" podStartSLOduration=2.207222519 podStartE2EDuration="6.099607021s" podCreationTimestamp="2025-12-07 19:28:15 +0000 UTC" firstStartedPulling="2025-12-07 19:28:16.642865409 +0000 UTC m=+801.221855464" lastFinishedPulling="2025-12-07 19:28:20.535249921 +0000 UTC m=+805.114239966" observedRunningTime="2025-12-07 19:28:21.089272216 +0000 UTC m=+805.668262261" watchObservedRunningTime="2025-12-07 19:28:21.099607021 +0000 UTC m=+805.678597066" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.116803 4815 scope.go:117] "RemoveContainer" containerID="89bdf2690a49d5bd2627167a05e3ba8662973a530da0f426f3d19df480d9cec2" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.144761 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fnq5m" podStartSLOduration=1.099742675 podStartE2EDuration="6.144743408s" podCreationTimestamp="2025-12-07 19:28:15 +0000 UTC" firstStartedPulling="2025-12-07 19:28:15.492011388 +0000 UTC m=+800.071001433" lastFinishedPulling="2025-12-07 19:28:20.537012111 +0000 UTC m=+805.116002166" observedRunningTime="2025-12-07 19:28:21.114766993 +0000 UTC m=+805.693757038" watchObservedRunningTime="2025-12-07 19:28:21.144743408 +0000 UTC m=+805.723733453" Dec 07 19:28:21 crc 
kubenswrapper[4815]: I1207 19:28:21.147600 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4" podStartSLOduration=1.9913374 podStartE2EDuration="6.147586599s" podCreationTimestamp="2025-12-07 19:28:15 +0000 UTC" firstStartedPulling="2025-12-07 19:28:16.380057072 +0000 UTC m=+800.959047117" lastFinishedPulling="2025-12-07 19:28:20.536306261 +0000 UTC m=+805.115296316" observedRunningTime="2025-12-07 19:28:21.144144301 +0000 UTC m=+805.723134356" watchObservedRunningTime="2025-12-07 19:28:21.147586599 +0000 UTC m=+805.726576644" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.157513 4815 scope.go:117] "RemoveContainer" containerID="5dcfa2cce897d4fbae1844b2acf9ceed436ca5fa9a7fca2e56313a53fe9adc9f" Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.161163 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.180214 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sn7cx"] Dec 07 19:28:21 crc kubenswrapper[4815]: I1207 19:28:21.783551 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" path="/var/lib/kubelet/pods/e3107902-48ef-4548-a73a-a57a0f36d5d2/volumes" Dec 07 19:28:22 crc kubenswrapper[4815]: I1207 19:28:22.090587 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4" Dec 07 19:28:23 crc kubenswrapper[4815]: I1207 19:28:23.098304 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm" event={"ID":"de2e93af-0942-4efc-a33f-c9685c554154","Type":"ContainerStarted","Data":"992f89a3b7602d65c0f63d2829ce964f139ab2d277d2857f929e978f10457906"} Dec 07 19:28:25 crc kubenswrapper[4815]: I1207 19:28:25.472967 4815 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fnq5m" Dec 07 19:28:25 crc kubenswrapper[4815]: I1207 19:28:25.490701 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-4z7hm" podStartSLOduration=3.36295852 podStartE2EDuration="10.490684108s" podCreationTimestamp="2025-12-07 19:28:15 +0000 UTC" firstStartedPulling="2025-12-07 19:28:15.701839184 +0000 UTC m=+800.280829229" lastFinishedPulling="2025-12-07 19:28:22.829564762 +0000 UTC m=+807.408554817" observedRunningTime="2025-12-07 19:28:23.122045246 +0000 UTC m=+807.701035361" watchObservedRunningTime="2025-12-07 19:28:25.490684108 +0000 UTC m=+810.069674153" Dec 07 19:28:25 crc kubenswrapper[4815]: I1207 19:28:25.808980 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-98fb7895b-jgq24" Dec 07 19:28:25 crc kubenswrapper[4815]: I1207 19:28:25.809036 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-98fb7895b-jgq24" Dec 07 19:28:25 crc kubenswrapper[4815]: I1207 19:28:25.815709 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-98fb7895b-jgq24" Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.125028 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-98fb7895b-jgq24" Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.257359 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.360171 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:28:26 crc 
kubenswrapper[4815]: I1207 19:28:26.360257 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.360299 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.360845 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:28:26 crc kubenswrapper[4815]: I1207 19:28:26.360901 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212" gracePeriod=600 Dec 07 19:28:27 crc kubenswrapper[4815]: I1207 19:28:27.130899 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212" exitCode=0 Dec 07 19:28:27 crc kubenswrapper[4815]: I1207 19:28:27.130996 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212"} 
Dec 07 19:28:27 crc kubenswrapper[4815]: I1207 19:28:27.131647 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac"} Dec 07 19:28:27 crc kubenswrapper[4815]: I1207 19:28:27.131687 4815 scope.go:117] "RemoveContainer" containerID="2066ef4b58d85dd6cfa5f3ca947456828e3a7a719f157903ecc03c9ca5f9a8f6" Dec 07 19:28:36 crc kubenswrapper[4815]: I1207 19:28:36.006244 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8khj4" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.352272 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m"] Dec 07 19:28:50 crc kubenswrapper[4815]: E1207 19:28:50.353168 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="registry-server" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.353189 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="registry-server" Dec 07 19:28:50 crc kubenswrapper[4815]: E1207 19:28:50.353204 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="extract-utilities" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.353217 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="extract-utilities" Dec 07 19:28:50 crc kubenswrapper[4815]: E1207 19:28:50.353237 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="extract-content" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.353252 4815 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="extract-content" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.353431 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3107902-48ef-4548-a73a-a57a0f36d5d2" containerName="registry-server" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.354637 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.356672 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.364283 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m"] Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.521151 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4fp\" (UniqueName: \"kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.521216 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.521658 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.623122 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4fp\" (UniqueName: \"kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.623464 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.623598 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.624132 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.624339 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.648756 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4fp\" (UniqueName: \"kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:50 crc kubenswrapper[4815]: I1207 19:28:50.727364 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.143113 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m"] Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.310346 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" event={"ID":"2599f100-fc55-4ced-96e8-1a6cc7de37f0","Type":"ContainerStarted","Data":"62dea344a3fdaa0e8c56e6bf33c5612927488ebf67eedd2f08902f507c316239"} Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.313544 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-xxlhj" podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" containerID="cri-o://48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19" gracePeriod=15 Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.692077 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xxlhj_f8d9863a-2779-463c-8d73-76246a51b333/console/0.log" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.692389 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841321 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841376 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841418 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841457 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msgq4\" (UniqueName: \"kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841523 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.841635 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.842576 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca" (OuterVolumeSpecName: "service-ca") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.842604 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config" (OuterVolumeSpecName: "console-config") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.842568 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.842872 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config\") pod \"f8d9863a-2779-463c-8d73-76246a51b333\" (UID: \"f8d9863a-2779-463c-8d73-76246a51b333\") " Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.842968 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.843166 4815 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.843192 4815 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-console-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.843204 4815 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.843217 4815 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8d9863a-2779-463c-8d73-76246a51b333-service-ca\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 
19:28:51.847887 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.848244 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.848693 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4" (OuterVolumeSpecName: "kube-api-access-msgq4") pod "f8d9863a-2779-463c-8d73-76246a51b333" (UID: "f8d9863a-2779-463c-8d73-76246a51b333"). InnerVolumeSpecName "kube-api-access-msgq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.944783 4815 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.944815 4815 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8d9863a-2779-463c-8d73-76246a51b333-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:51 crc kubenswrapper[4815]: I1207 19:28:51.944824 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msgq4\" (UniqueName: \"kubernetes.io/projected/f8d9863a-2779-463c-8d73-76246a51b333-kube-api-access-msgq4\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.317321 4815 generic.go:334] "Generic (PLEG): container finished" podID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerID="419898d78830fc12a8a4126ec25ba3feab891e37f6d0e012ca354038c406c8b6" exitCode=0 Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.317358 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" event={"ID":"2599f100-fc55-4ced-96e8-1a6cc7de37f0","Type":"ContainerDied","Data":"419898d78830fc12a8a4126ec25ba3feab891e37f6d0e012ca354038c406c8b6"} Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.319952 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-xxlhj_f8d9863a-2779-463c-8d73-76246a51b333/console/0.log" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.319987 4815 generic.go:334] "Generic (PLEG): container finished" podID="f8d9863a-2779-463c-8d73-76246a51b333" containerID="48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19" exitCode=2 Dec 07 19:28:52 
crc kubenswrapper[4815]: I1207 19:28:52.320014 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xxlhj" event={"ID":"f8d9863a-2779-463c-8d73-76246a51b333","Type":"ContainerDied","Data":"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19"} Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.320038 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xxlhj" event={"ID":"f8d9863a-2779-463c-8d73-76246a51b333","Type":"ContainerDied","Data":"61a99deab59b2786e77d7d3b14a9b598cfcf1fe6178097309645deeddf7bc3eb"} Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.320059 4815 scope.go:117] "RemoveContainer" containerID="48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.320193 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xxlhj" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.346187 4815 scope.go:117] "RemoveContainer" containerID="48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19" Dec 07 19:28:52 crc kubenswrapper[4815]: E1207 19:28:52.346659 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19\": container with ID starting with 48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19 not found: ID does not exist" containerID="48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.346706 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19"} err="failed to get container status \"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19\": rpc error: code = NotFound desc 
= could not find container \"48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19\": container with ID starting with 48f3e002343e239f0cee8fc41f70041db449369f6faab1576219881caedd3f19 not found: ID does not exist" Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.358043 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:28:52 crc kubenswrapper[4815]: I1207 19:28:52.362118 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-xxlhj"] Dec 07 19:28:53 crc kubenswrapper[4815]: I1207 19:28:53.782634 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8d9863a-2779-463c-8d73-76246a51b333" path="/var/lib/kubelet/pods/f8d9863a-2779-463c-8d73-76246a51b333/volumes" Dec 07 19:28:54 crc kubenswrapper[4815]: I1207 19:28:54.341983 4815 generic.go:334] "Generic (PLEG): container finished" podID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerID="6d6e6fe645ef744abd6ffcf243248e74512a546fd9b3f0046e84a3395832fef2" exitCode=0 Dec 07 19:28:54 crc kubenswrapper[4815]: I1207 19:28:54.342026 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" event={"ID":"2599f100-fc55-4ced-96e8-1a6cc7de37f0","Type":"ContainerDied","Data":"6d6e6fe645ef744abd6ffcf243248e74512a546fd9b3f0046e84a3395832fef2"} Dec 07 19:28:56 crc kubenswrapper[4815]: I1207 19:28:56.368699 4815 generic.go:334] "Generic (PLEG): container finished" podID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerID="1ac6cbad24ed23fb02a96ca74ed535d2584f1bba63facdb12b094e7b7df9c0b6" exitCode=0 Dec 07 19:28:56 crc kubenswrapper[4815]: I1207 19:28:56.368783 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" 
event={"ID":"2599f100-fc55-4ced-96e8-1a6cc7de37f0","Type":"ContainerDied","Data":"1ac6cbad24ed23fb02a96ca74ed535d2584f1bba63facdb12b094e7b7df9c0b6"} Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.688608 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.854968 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4fp\" (UniqueName: \"kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp\") pod \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.855066 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util\") pod \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.855115 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle\") pod \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\" (UID: \"2599f100-fc55-4ced-96e8-1a6cc7de37f0\") " Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.857206 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle" (OuterVolumeSpecName: "bundle") pod "2599f100-fc55-4ced-96e8-1a6cc7de37f0" (UID: "2599f100-fc55-4ced-96e8-1a6cc7de37f0"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.864034 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp" (OuterVolumeSpecName: "kube-api-access-hd4fp") pod "2599f100-fc55-4ced-96e8-1a6cc7de37f0" (UID: "2599f100-fc55-4ced-96e8-1a6cc7de37f0"). InnerVolumeSpecName "kube-api-access-hd4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.871990 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util" (OuterVolumeSpecName: "util") pod "2599f100-fc55-4ced-96e8-1a6cc7de37f0" (UID: "2599f100-fc55-4ced-96e8-1a6cc7de37f0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.957275 4815 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-util\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.957626 4815 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2599f100-fc55-4ced-96e8-1a6cc7de37f0-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:57 crc kubenswrapper[4815]: I1207 19:28:57.957748 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4fp\" (UniqueName: \"kubernetes.io/projected/2599f100-fc55-4ced-96e8-1a6cc7de37f0-kube-api-access-hd4fp\") on node \"crc\" DevicePath \"\"" Dec 07 19:28:58 crc kubenswrapper[4815]: I1207 19:28:58.388782 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" 
event={"ID":"2599f100-fc55-4ced-96e8-1a6cc7de37f0","Type":"ContainerDied","Data":"62dea344a3fdaa0e8c56e6bf33c5612927488ebf67eedd2f08902f507c316239"} Dec 07 19:28:58 crc kubenswrapper[4815]: I1207 19:28:58.388835 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62dea344a3fdaa0e8c56e6bf33c5612927488ebf67eedd2f08902f507c316239" Dec 07 19:28:58 crc kubenswrapper[4815]: I1207 19:28:58.388943 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505026 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-b799b5946-s42vm"] Dec 07 19:29:08 crc kubenswrapper[4815]: E1207 19:29:08.505623 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="extract" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505635 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="extract" Dec 07 19:29:08 crc kubenswrapper[4815]: E1207 19:29:08.505646 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="pull" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505651 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="pull" Dec 07 19:29:08 crc kubenswrapper[4815]: E1207 19:29:08.505665 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505672 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" Dec 07 19:29:08 crc kubenswrapper[4815]: E1207 19:29:08.505680 4815 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="util" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505686 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="util" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505772 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2599f100-fc55-4ced-96e8-1a6cc7de37f0" containerName="extract" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.505786 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8d9863a-2779-463c-8d73-76246a51b333" containerName="console" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.506147 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.507830 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.508603 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.509018 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-76ws9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.510122 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.511277 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.536024 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-b799b5946-s42vm"] Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.592743 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-apiservice-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.592856 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-webhook-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.592896 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpj4\" (UniqueName: \"kubernetes.io/projected/2855ae17-4194-4def-b747-15972b85f28f-kube-api-access-krpj4\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.693782 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-webhook-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.693840 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-krpj4\" (UniqueName: \"kubernetes.io/projected/2855ae17-4194-4def-b747-15972b85f28f-kube-api-access-krpj4\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.693861 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-apiservice-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.704266 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-apiservice-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.704678 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2855ae17-4194-4def-b747-15972b85f28f-webhook-cert\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: \"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.723091 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpj4\" (UniqueName: \"kubernetes.io/projected/2855ae17-4194-4def-b747-15972b85f28f-kube-api-access-krpj4\") pod \"metallb-operator-controller-manager-b799b5946-s42vm\" (UID: 
\"2855ae17-4194-4def-b747-15972b85f28f\") " pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.792343 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9"] Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.793004 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.794888 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.795373 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tkl\" (UniqueName: \"kubernetes.io/projected/a2b44f0b-e1f1-472c-816e-acfca6f08db5-kube-api-access-k8tkl\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.795415 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-webhook-cert\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.795453 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-apiservice-cert\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " 
pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.796757 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.796981 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pvdpk" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.810836 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9"] Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.822615 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.896491 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tkl\" (UniqueName: \"kubernetes.io/projected/a2b44f0b-e1f1-472c-816e-acfca6f08db5-kube-api-access-k8tkl\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.896860 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-webhook-cert\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.896905 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-apiservice-cert\") pod 
\"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.905595 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-webhook-cert\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.909129 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a2b44f0b-e1f1-472c-816e-acfca6f08db5-apiservice-cert\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:08 crc kubenswrapper[4815]: I1207 19:29:08.929869 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tkl\" (UniqueName: \"kubernetes.io/projected/a2b44f0b-e1f1-472c-816e-acfca6f08db5-kube-api-access-k8tkl\") pod \"metallb-operator-webhook-server-858ccc4d87-28ck9\" (UID: \"a2b44f0b-e1f1-472c-816e-acfca6f08db5\") " pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:09 crc kubenswrapper[4815]: I1207 19:29:09.173805 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:09 crc kubenswrapper[4815]: I1207 19:29:09.209890 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-b799b5946-s42vm"] Dec 07 19:29:09 crc kubenswrapper[4815]: I1207 19:29:09.455464 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" event={"ID":"2855ae17-4194-4def-b747-15972b85f28f","Type":"ContainerStarted","Data":"d0d5398de41e25219bc6102cd84605f48633a342faafbcbf65561e8222703645"} Dec 07 19:29:09 crc kubenswrapper[4815]: I1207 19:29:09.559672 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9"] Dec 07 19:29:09 crc kubenswrapper[4815]: W1207 19:29:09.567394 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2b44f0b_e1f1_472c_816e_acfca6f08db5.slice/crio-d6096a23fcc4c91582a91a6e32d6153f5d31586b893fb0906468d990f3e951cc WatchSource:0}: Error finding container d6096a23fcc4c91582a91a6e32d6153f5d31586b893fb0906468d990f3e951cc: Status 404 returned error can't find the container with id d6096a23fcc4c91582a91a6e32d6153f5d31586b893fb0906468d990f3e951cc Dec 07 19:29:10 crc kubenswrapper[4815]: I1207 19:29:10.467739 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" event={"ID":"a2b44f0b-e1f1-472c-816e-acfca6f08db5","Type":"ContainerStarted","Data":"d6096a23fcc4c91582a91a6e32d6153f5d31586b893fb0906468d990f3e951cc"} Dec 07 19:29:13 crc kubenswrapper[4815]: I1207 19:29:13.516800 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" 
event={"ID":"2855ae17-4194-4def-b747-15972b85f28f","Type":"ContainerStarted","Data":"e75d5042cd0d3ec643cccb89bde5dd985839011e0f0aa0219e2296cfbb76d96f"} Dec 07 19:29:13 crc kubenswrapper[4815]: I1207 19:29:13.517399 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:13 crc kubenswrapper[4815]: I1207 19:29:13.545148 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" podStartSLOduration=1.547726191 podStartE2EDuration="5.545129768s" podCreationTimestamp="2025-12-07 19:29:08 +0000 UTC" firstStartedPulling="2025-12-07 19:29:09.240954769 +0000 UTC m=+853.819944814" lastFinishedPulling="2025-12-07 19:29:13.238358346 +0000 UTC m=+857.817348391" observedRunningTime="2025-12-07 19:29:13.537432128 +0000 UTC m=+858.116422173" watchObservedRunningTime="2025-12-07 19:29:13.545129768 +0000 UTC m=+858.124119813" Dec 07 19:29:16 crc kubenswrapper[4815]: I1207 19:29:16.554103 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" event={"ID":"a2b44f0b-e1f1-472c-816e-acfca6f08db5","Type":"ContainerStarted","Data":"2a13c7184053e654566f96d2e5bcc6856a845e3bf9c5f59cf1a40402c6aa90fc"} Dec 07 19:29:16 crc kubenswrapper[4815]: I1207 19:29:16.554602 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:16 crc kubenswrapper[4815]: I1207 19:29:16.586316 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" podStartSLOduration=2.372203632 podStartE2EDuration="8.586294196s" podCreationTimestamp="2025-12-07 19:29:08 +0000 UTC" firstStartedPulling="2025-12-07 19:29:09.569738479 +0000 UTC m=+854.148728524" lastFinishedPulling="2025-12-07 
19:29:15.783829033 +0000 UTC m=+860.362819088" observedRunningTime="2025-12-07 19:29:16.579024948 +0000 UTC m=+861.158014993" watchObservedRunningTime="2025-12-07 19:29:16.586294196 +0000 UTC m=+861.165284241" Dec 07 19:29:29 crc kubenswrapper[4815]: I1207 19:29:29.189688 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-858ccc4d87-28ck9" Dec 07 19:29:48 crc kubenswrapper[4815]: I1207 19:29:48.826346 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-b799b5946-s42vm" Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.657025 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"] Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.658081 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67" Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.664117 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wp7l7"] Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.666175 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.668693 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.669190 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.669214 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.669732 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qmb9h"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.677202 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"]
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.702988 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703256 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92sl\" (UniqueName: \"kubernetes.io/projected/b91cf016-e00e-4b0c-a1ef-2b851971e00b-kube-api-access-x92sl\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703370 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-reloader\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703464 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-metrics\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703548 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f91a6d74-3342-4de5-92c4-354251161c5d-frr-startup\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703642 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-sockets\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703757 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-conf\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703840 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8c2\" (UniqueName: \"kubernetes.io/projected/f91a6d74-3342-4de5-92c4-354251161c5d-kube-api-access-xl8c2\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.703979 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91cf016-e00e-4b0c-a1ef-2b851971e00b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805174 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92sl\" (UniqueName: \"kubernetes.io/projected/b91cf016-e00e-4b0c-a1ef-2b851971e00b-kube-api-access-x92sl\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805243 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-reloader\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805277 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-metrics\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805304 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f91a6d74-3342-4de5-92c4-354251161c5d-frr-startup\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805330 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-sockets\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805367 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-conf\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805389 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8c2\" (UniqueName: \"kubernetes.io/projected/f91a6d74-3342-4de5-92c4-354251161c5d-kube-api-access-xl8c2\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805411 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91cf016-e00e-4b0c-a1ef-2b851971e00b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.805454 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: E1207 19:29:49.805584 4815 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Dec 07 19:29:49 crc kubenswrapper[4815]: E1207 19:29:49.805641 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs podName:f91a6d74-3342-4de5-92c4-354251161c5d nodeName:}" failed. No retries permitted until 2025-12-07 19:29:50.30562341 +0000 UTC m=+894.884613455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs") pod "frr-k8s-wp7l7" (UID: "f91a6d74-3342-4de5-92c4-354251161c5d") : secret "frr-k8s-certs-secret" not found
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.806378 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-reloader\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.806594 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-metrics\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.807380 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f91a6d74-3342-4de5-92c4-354251161c5d-frr-startup\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.807633 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-sockets\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.807857 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f91a6d74-3342-4de5-92c4-354251161c5d-frr-conf\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.813114 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91cf016-e00e-4b0c-a1ef-2b851971e00b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.831299 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92sl\" (UniqueName: \"kubernetes.io/projected/b91cf016-e00e-4b0c-a1ef-2b851971e00b-kube-api-access-x92sl\") pod \"frr-k8s-webhook-server-7fcb986d4-7zz67\" (UID: \"b91cf016-e00e-4b0c-a1ef-2b851971e00b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.843221 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8c2\" (UniqueName: \"kubernetes.io/projected/f91a6d74-3342-4de5-92c4-354251161c5d-kube-api-access-xl8c2\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.877112 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-xdlfl"]
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.878204 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.886278 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.886343 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.886455 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.894206 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2fsh6"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.913578 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-29cdv"]
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.914377 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.918300 4815 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.940876 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-29cdv"]
Dec 07 19:29:49 crc kubenswrapper[4815]: I1207 19:29:49.988255 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007220 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgcdn\" (UniqueName: \"kubernetes.io/projected/cd570d1e-fecb-4987-b842-d3e40a89a7a7-kube-api-access-dgcdn\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007264 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbwp\" (UniqueName: \"kubernetes.io/projected/6454b83c-e2d6-413c-abbf-21a8f749750c-kube-api-access-ffbwp\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007314 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metallb-excludel2\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007332 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metrics-certs\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007351 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007367 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-cert\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.007391 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.108981 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgcdn\" (UniqueName: \"kubernetes.io/projected/cd570d1e-fecb-4987-b842-d3e40a89a7a7-kube-api-access-dgcdn\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109034 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbwp\" (UniqueName: \"kubernetes.io/projected/6454b83c-e2d6-413c-abbf-21a8f749750c-kube-api-access-ffbwp\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109105 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metallb-excludel2\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109132 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metrics-certs\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109156 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109178 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-cert\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.109207 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.109334 4815 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.109387 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist podName:cd570d1e-fecb-4987-b842-d3e40a89a7a7 nodeName:}" failed. No retries permitted until 2025-12-07 19:29:50.609370406 +0000 UTC m=+895.188360451 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist") pod "speaker-xdlfl" (UID: "cd570d1e-fecb-4987-b842-d3e40a89a7a7") : secret "metallb-memberlist" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.109500 4815 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.109576 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs podName:6454b83c-e2d6-413c-abbf-21a8f749750c nodeName:}" failed. No retries permitted until 2025-12-07 19:29:50.609553781 +0000 UTC m=+895.188543876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs") pod "controller-f8648f98b-29cdv" (UID: "6454b83c-e2d6-413c-abbf-21a8f749750c") : secret "controller-certs-secret" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.110586 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metallb-excludel2\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.114326 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-metrics-certs\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.119768 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-cert\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.143374 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgcdn\" (UniqueName: \"kubernetes.io/projected/cd570d1e-fecb-4987-b842-d3e40a89a7a7-kube-api-access-dgcdn\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.152634 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbwp\" (UniqueName: \"kubernetes.io/projected/6454b83c-e2d6-413c-abbf-21a8f749750c-kube-api-access-ffbwp\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.261193 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"]
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.312213 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.316646 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f91a6d74-3342-4de5-92c4-354251161c5d-metrics-certs\") pod \"frr-k8s-wp7l7\" (UID: \"f91a6d74-3342-4de5-92c4-354251161c5d\") " pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.582943 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wp7l7"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.615416 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.615509 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.615769 4815 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: E1207 19:29:50.615858 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist podName:cd570d1e-fecb-4987-b842-d3e40a89a7a7 nodeName:}" failed. No retries permitted until 2025-12-07 19:29:51.615834924 +0000 UTC m=+896.194825009 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist") pod "speaker-xdlfl" (UID: "cd570d1e-fecb-4987-b842-d3e40a89a7a7") : secret "metallb-memberlist" not found
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.619416 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6454b83c-e2d6-413c-abbf-21a8f749750c-metrics-certs\") pod \"controller-f8648f98b-29cdv\" (UID: \"6454b83c-e2d6-413c-abbf-21a8f749750c\") " pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.761566 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67" event={"ID":"b91cf016-e00e-4b0c-a1ef-2b851971e00b","Type":"ContainerStarted","Data":"6de922cf04bed480d6320aeb87d867eb8fccb5a5afb1a5e50169e7a618212c7a"}
Dec 07 19:29:50 crc kubenswrapper[4815]: I1207 19:29:50.829385 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.105587 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-29cdv"]
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.627774 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.632644 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cd570d1e-fecb-4987-b842-d3e40a89a7a7-memberlist\") pod \"speaker-xdlfl\" (UID: \"cd570d1e-fecb-4987-b842-d3e40a89a7a7\") " pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.695747 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.792245 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-29cdv" event={"ID":"6454b83c-e2d6-413c-abbf-21a8f749750c","Type":"ContainerStarted","Data":"0e33baa2940387f69e364becdc924bac6a15e54c05b6b5c086b8e4c06eafffe5"}
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.792287 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-29cdv" event={"ID":"6454b83c-e2d6-413c-abbf-21a8f749750c","Type":"ContainerStarted","Data":"ee25699b0b42d08770b1cd947670d83dcb204a2d7fbfc14f9f6751f646511445"}
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.792297 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-29cdv" event={"ID":"6454b83c-e2d6-413c-abbf-21a8f749750c","Type":"ContainerStarted","Data":"2d5495c2cc67fc357192b6a14bd63b8a04bbc7a4f18db6cd0f5ea8757e1c0f2c"}
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.792569 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-29cdv"
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.793820 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"614a49ea7b00ed9bc2ef2ceceb62adfa9d9b84ba98fd5b1c38b2def3b25bcd35"}
Dec 07 19:29:51 crc kubenswrapper[4815]: W1207 19:29:51.802164 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd570d1e_fecb_4987_b842_d3e40a89a7a7.slice/crio-bceef6bc0a5daf451df29d8bb46f3016dae0f95722c285335db66db4e5b369d7 WatchSource:0}: Error finding container bceef6bc0a5daf451df29d8bb46f3016dae0f95722c285335db66db4e5b369d7: Status 404 returned error can't find the container with id bceef6bc0a5daf451df29d8bb46f3016dae0f95722c285335db66db4e5b369d7
Dec 07 19:29:51 crc kubenswrapper[4815]: I1207 19:29:51.824308 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-29cdv" podStartSLOduration=2.824279617 podStartE2EDuration="2.824279617s" podCreationTimestamp="2025-12-07 19:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:29:51.818446671 +0000 UTC m=+896.397436706" watchObservedRunningTime="2025-12-07 19:29:51.824279617 +0000 UTC m=+896.403269692"
Dec 07 19:29:52 crc kubenswrapper[4815]: I1207 19:29:52.805125 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xdlfl" event={"ID":"cd570d1e-fecb-4987-b842-d3e40a89a7a7","Type":"ContainerStarted","Data":"4bfb2cb1c1215220a7cd6a65b5565d36fa126021da223987b71dd54a4531fe55"}
Dec 07 19:29:52 crc kubenswrapper[4815]: I1207 19:29:52.805720 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xdlfl" event={"ID":"cd570d1e-fecb-4987-b842-d3e40a89a7a7","Type":"ContainerStarted","Data":"3b4340b23806c8672e840afe2e5872aae5813e0fe73fa15ce8d4c9902c3eb98f"}
Dec 07 19:29:52 crc kubenswrapper[4815]: I1207 19:29:52.805740 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-xdlfl" event={"ID":"cd570d1e-fecb-4987-b842-d3e40a89a7a7","Type":"ContainerStarted","Data":"bceef6bc0a5daf451df29d8bb46f3016dae0f95722c285335db66db4e5b369d7"}
Dec 07 19:29:52 crc kubenswrapper[4815]: I1207 19:29:52.805864 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-xdlfl"
Dec 07 19:29:55 crc kubenswrapper[4815]: I1207 19:29:55.802770 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-xdlfl" podStartSLOduration=6.802752235 podStartE2EDuration="6.802752235s" podCreationTimestamp="2025-12-07 19:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:29:52.832101469 +0000 UTC m=+897.411091524" watchObservedRunningTime="2025-12-07 19:29:55.802752235 +0000 UTC m=+900.381742280"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.153590 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"]
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.154701 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.156678 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.156821 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.158282 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"]
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.345632 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s4mb\" (UniqueName: \"kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.345687 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.345702 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.631816 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s4mb\" (UniqueName: \"kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.631898 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.631977 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.633150 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.643244 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.664838 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s4mb\" (UniqueName: \"kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb\") pod \"collect-profiles-29418930-bjxq9\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:00 crc kubenswrapper[4815]: I1207 19:30:00.779108 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:01 crc kubenswrapper[4815]: I1207 19:30:01.733509 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"]
Dec 07 19:30:01 crc kubenswrapper[4815]: I1207 19:30:01.909184 4815 generic.go:334] "Generic (PLEG): container finished" podID="f91a6d74-3342-4de5-92c4-354251161c5d" containerID="8a4219871f62ed0b0d21f121436dfcba5520d88934084a43ea5532310f6196a2" exitCode=0
Dec 07 19:30:01 crc kubenswrapper[4815]: I1207 19:30:01.909270 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerDied","Data":"8a4219871f62ed0b0d21f121436dfcba5520d88934084a43ea5532310f6196a2"}
Dec 07 19:30:01 crc kubenswrapper[4815]: I1207 19:30:01.911415 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9" event={"ID":"491d1d4c-d450-4c57-a735-5c03728dd027","Type":"ContainerStarted","Data":"817ed7315849b15591b2eff668422b540515d89fad7bcb07b2454a3287cec726"}
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.920714 4815 generic.go:334] "Generic (PLEG): container finished" podID="f91a6d74-3342-4de5-92c4-354251161c5d" containerID="9561f64c6c6bf652dcc126f5f3d9e85588acc52f4ac7feb1bd4fc260110389de" exitCode=0
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.921229 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerDied","Data":"9561f64c6c6bf652dcc126f5f3d9e85588acc52f4ac7feb1bd4fc260110389de"}
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.926141 4815 generic.go:334] "Generic (PLEG): container finished" podID="491d1d4c-d450-4c57-a735-5c03728dd027" containerID="7fe28ae0a3706a4cb6c8e0cfbf9fa44dd50b5ba34408e6501908db316e910725" exitCode=0
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.926217 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9" event={"ID":"491d1d4c-d450-4c57-a735-5c03728dd027","Type":"ContainerDied","Data":"7fe28ae0a3706a4cb6c8e0cfbf9fa44dd50b5ba34408e6501908db316e910725"}
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.929617 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67" event={"ID":"b91cf016-e00e-4b0c-a1ef-2b851971e00b","Type":"ContainerStarted","Data":"7e11799a538936b2f5d4c7469bdd113aaa2d72c67057eea85fce62225a313046"}
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.930357 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67"
Dec 07 19:30:02 crc kubenswrapper[4815]: I1207 19:30:02.964054 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67" podStartSLOduration=2.644819329 podStartE2EDuration="13.964034661s" podCreationTimestamp="2025-12-07 19:29:49 +0000 UTC" firstStartedPulling="2025-12-07 19:29:50.269084992 +0000 UTC m=+894.848075037" lastFinishedPulling="2025-12-07 19:30:01.588300324 +0000 UTC m=+906.167290369" observedRunningTime="2025-12-07 19:30:02.961391926 +0000 UTC m=+907.540381981" watchObservedRunningTime="2025-12-07 19:30:02.964034661 +0000 UTC m=+907.543024706"
Dec 07 19:30:03 crc kubenswrapper[4815]: I1207 19:30:03.946966 4815 generic.go:334] "Generic (PLEG): container finished" podID="f91a6d74-3342-4de5-92c4-354251161c5d" containerID="3818f2d29bd930aa5068aae186535fb20adb1076746f7738fd6153a814a40f64" exitCode=0
Dec 07 19:30:03 crc kubenswrapper[4815]: I1207 19:30:03.947073 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerDied","Data":"3818f2d29bd930aa5068aae186535fb20adb1076746f7738fd6153a814a40f64"}
Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.398528 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9"
Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.582267 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume\") pod \"491d1d4c-d450-4c57-a735-5c03728dd027\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") "
Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.582549 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume\") pod \"491d1d4c-d450-4c57-a735-5c03728dd027\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") "
Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.582584 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s4mb\" (UniqueName: \"kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb\") pod \"491d1d4c-d450-4c57-a735-5c03728dd027\" (UID: \"491d1d4c-d450-4c57-a735-5c03728dd027\") "
Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.582985 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume" (OuterVolumeSpecName: "config-volume") pod "491d1d4c-d450-4c57-a735-5c03728dd027" (UID: "491d1d4c-d450-4c57-a735-5c03728dd027"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.587278 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb" (OuterVolumeSpecName: "kube-api-access-6s4mb") pod "491d1d4c-d450-4c57-a735-5c03728dd027" (UID: "491d1d4c-d450-4c57-a735-5c03728dd027"). InnerVolumeSpecName "kube-api-access-6s4mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.587302 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "491d1d4c-d450-4c57-a735-5c03728dd027" (UID: "491d1d4c-d450-4c57-a735-5c03728dd027"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.684230 4815 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/491d1d4c-d450-4c57-a735-5c03728dd027-config-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.684267 4815 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/491d1d4c-d450-4c57-a735-5c03728dd027-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.684278 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s4mb\" (UniqueName: \"kubernetes.io/projected/491d1d4c-d450-4c57-a735-5c03728dd027-kube-api-access-6s4mb\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.956936 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" 
event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"de84a4997af92771157810edf62a39c44ee5f18b3d9ad919100ac91548a17b63"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.956982 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"2b0e88eadee344fc22271ce0a070ea17426a622afd46fb01e5d330c368de1a96"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.956999 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"19ab0299eb9a86a0b51926531b6a603bbb1592447e9949c48db22b1ed0db39c8"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.957012 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"c2899921e0ea3df3fc22207223bfe3bd96831ff95f73ad49e4e4f8a05ae94ecb"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.957027 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"20af4636abed6da35f3102e2881b5b97db1a387d10d7ed4808b74669e90a2797"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.958971 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9" Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.961109 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418930-bjxq9" event={"ID":"491d1d4c-d450-4c57-a735-5c03728dd027","Type":"ContainerDied","Data":"817ed7315849b15591b2eff668422b540515d89fad7bcb07b2454a3287cec726"} Dec 07 19:30:04 crc kubenswrapper[4815]: I1207 19:30:04.961137 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="817ed7315849b15591b2eff668422b540515d89fad7bcb07b2454a3287cec726" Dec 07 19:30:05 crc kubenswrapper[4815]: I1207 19:30:05.969320 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wp7l7" event={"ID":"f91a6d74-3342-4de5-92c4-354251161c5d","Type":"ContainerStarted","Data":"401c53bb35ce6b33e1abd8804fa838697c91c6e04e82bf1d8ee3fbc52b278ac3"} Dec 07 19:30:05 crc kubenswrapper[4815]: I1207 19:30:05.969697 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wp7l7" Dec 07 19:30:05 crc kubenswrapper[4815]: I1207 19:30:05.995132 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wp7l7" podStartSLOduration=6.797591353 podStartE2EDuration="16.995111576s" podCreationTimestamp="2025-12-07 19:29:49 +0000 UTC" firstStartedPulling="2025-12-07 19:29:51.404481502 +0000 UTC m=+895.983471557" lastFinishedPulling="2025-12-07 19:30:01.602001735 +0000 UTC m=+906.180991780" observedRunningTime="2025-12-07 19:30:05.993140389 +0000 UTC m=+910.572130454" watchObservedRunningTime="2025-12-07 19:30:05.995111576 +0000 UTC m=+910.574101641" Dec 07 19:30:10 crc kubenswrapper[4815]: I1207 19:30:10.584176 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wp7l7" Dec 07 19:30:10 crc kubenswrapper[4815]: I1207 19:30:10.635241 4815 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wp7l7" Dec 07 19:30:10 crc kubenswrapper[4815]: I1207 19:30:10.834963 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-29cdv" Dec 07 19:30:11 crc kubenswrapper[4815]: I1207 19:30:11.703023 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-xdlfl" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.790652 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 07 19:30:14 crc kubenswrapper[4815]: E1207 19:30:14.791244 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="491d1d4c-d450-4c57-a735-5c03728dd027" containerName="collect-profiles" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.791260 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="491d1d4c-d450-4c57-a735-5c03728dd027" containerName="collect-profiles" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.791378 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="491d1d4c-d450-4c57-a735-5c03728dd027" containerName="collect-profiles" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.791861 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.795512 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.795848 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.799797 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cx2zb" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.817481 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xztbw\" (UniqueName: \"kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw\") pod \"openstack-operator-index-wtrwc\" (UID: \"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3\") " pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.852237 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.918542 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xztbw\" (UniqueName: \"kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw\") pod \"openstack-operator-index-wtrwc\" (UID: \"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3\") " pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:14 crc kubenswrapper[4815]: I1207 19:30:14.951099 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xztbw\" (UniqueName: \"kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw\") pod \"openstack-operator-index-wtrwc\" (UID: 
\"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3\") " pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:15 crc kubenswrapper[4815]: I1207 19:30:15.109793 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:15 crc kubenswrapper[4815]: I1207 19:30:15.549251 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 07 19:30:15 crc kubenswrapper[4815]: W1207 19:30:15.552368 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0a84e0_49dc_4f77_8e27_d9a9d98547c3.slice/crio-aa0997ad2ec563628ab2e51b66fb9625825a4dc65dbe3c200f16d1fa0f80f94c WatchSource:0}: Error finding container aa0997ad2ec563628ab2e51b66fb9625825a4dc65dbe3c200f16d1fa0f80f94c: Status 404 returned error can't find the container with id aa0997ad2ec563628ab2e51b66fb9625825a4dc65dbe3c200f16d1fa0f80f94c Dec 07 19:30:16 crc kubenswrapper[4815]: I1207 19:30:16.070172 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtrwc" event={"ID":"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3","Type":"ContainerStarted","Data":"aa0997ad2ec563628ab2e51b66fb9625825a4dc65dbe3c200f16d1fa0f80f94c"} Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.150114 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.590483 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xw277"] Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.591224 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.602212 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xw277"] Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.678305 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-862rj\" (UniqueName: \"kubernetes.io/projected/58821771-f5ba-40e9-8ba4-ccced4549d86-kube-api-access-862rj\") pod \"openstack-operator-index-xw277\" (UID: \"58821771-f5ba-40e9-8ba4-ccced4549d86\") " pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.779142 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-862rj\" (UniqueName: \"kubernetes.io/projected/58821771-f5ba-40e9-8ba4-ccced4549d86-kube-api-access-862rj\") pod \"openstack-operator-index-xw277\" (UID: \"58821771-f5ba-40e9-8ba4-ccced4549d86\") " pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.799497 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-862rj\" (UniqueName: \"kubernetes.io/projected/58821771-f5ba-40e9-8ba4-ccced4549d86-kube-api-access-862rj\") pod \"openstack-operator-index-xw277\" (UID: \"58821771-f5ba-40e9-8ba4-ccced4549d86\") " pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:17 crc kubenswrapper[4815]: I1207 19:30:17.916408 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:18 crc kubenswrapper[4815]: I1207 19:30:18.663185 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xw277"] Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.091872 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtrwc" event={"ID":"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3","Type":"ContainerStarted","Data":"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232"} Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.092027 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wtrwc" podUID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" containerName="registry-server" containerID="cri-o://2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232" gracePeriod=2 Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.094984 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xw277" event={"ID":"58821771-f5ba-40e9-8ba4-ccced4549d86","Type":"ContainerStarted","Data":"d5f447ff6f200b4551780cd5cc4f009b4e8a3fc02cb8fd5d8c47b567ce44dd66"} Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.095052 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xw277" event={"ID":"58821771-f5ba-40e9-8ba4-ccced4549d86","Type":"ContainerStarted","Data":"3f89535f1a0cadf97ac6af3bce38a37122191d850886a34ad526c1a2de163049"} Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.126479 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wtrwc" podStartSLOduration=2.386823002 podStartE2EDuration="5.126450478s" podCreationTimestamp="2025-12-07 19:30:14 +0000 UTC" firstStartedPulling="2025-12-07 19:30:15.554422688 +0000 UTC 
m=+920.133412733" lastFinishedPulling="2025-12-07 19:30:18.294050154 +0000 UTC m=+922.873040209" observedRunningTime="2025-12-07 19:30:19.120637421 +0000 UTC m=+923.699627486" watchObservedRunningTime="2025-12-07 19:30:19.126450478 +0000 UTC m=+923.705440533" Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.139885 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xw277" podStartSLOduration=2.065136023 podStartE2EDuration="2.139868144s" podCreationTimestamp="2025-12-07 19:30:17 +0000 UTC" firstStartedPulling="2025-12-07 19:30:18.673567295 +0000 UTC m=+923.252557350" lastFinishedPulling="2025-12-07 19:30:18.748299416 +0000 UTC m=+923.327289471" observedRunningTime="2025-12-07 19:30:19.138458614 +0000 UTC m=+923.717448659" watchObservedRunningTime="2025-12-07 19:30:19.139868144 +0000 UTC m=+923.718858189" Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.496449 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.599813 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xztbw\" (UniqueName: \"kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw\") pod \"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3\" (UID: \"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3\") " Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.605278 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw" (OuterVolumeSpecName: "kube-api-access-xztbw") pod "7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" (UID: "7c0a84e0-49dc-4f77-8e27-d9a9d98547c3"). InnerVolumeSpecName "kube-api-access-xztbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.701947 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xztbw\" (UniqueName: \"kubernetes.io/projected/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3-kube-api-access-xztbw\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:19 crc kubenswrapper[4815]: I1207 19:30:19.998246 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7zz67" Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.103301 4815 generic.go:334] "Generic (PLEG): container finished" podID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" containerID="2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232" exitCode=0 Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.103390 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wtrwc" Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.103434 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtrwc" event={"ID":"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3","Type":"ContainerDied","Data":"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232"} Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.103502 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wtrwc" event={"ID":"7c0a84e0-49dc-4f77-8e27-d9a9d98547c3","Type":"ContainerDied","Data":"aa0997ad2ec563628ab2e51b66fb9625825a4dc65dbe3c200f16d1fa0f80f94c"} Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.103525 4815 scope.go:117] "RemoveContainer" containerID="2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232" Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.123131 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 
07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.129306 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wtrwc"] Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.132280 4815 scope.go:117] "RemoveContainer" containerID="2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232" Dec 07 19:30:20 crc kubenswrapper[4815]: E1207 19:30:20.132684 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232\": container with ID starting with 2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232 not found: ID does not exist" containerID="2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232" Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.132717 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232"} err="failed to get container status \"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232\": rpc error: code = NotFound desc = could not find container \"2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232\": container with ID starting with 2117636cee54b910d3f054a46c41777213275f637376e8cc2c1d343a46b2e232 not found: ID does not exist" Dec 07 19:30:20 crc kubenswrapper[4815]: I1207 19:30:20.590035 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wp7l7" Dec 07 19:30:21 crc kubenswrapper[4815]: I1207 19:30:21.785756 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" path="/var/lib/kubelet/pods/7c0a84e0-49dc-4f77-8e27-d9a9d98547c3/volumes" Dec 07 19:30:26 crc kubenswrapper[4815]: I1207 19:30:26.359611 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:30:26 crc kubenswrapper[4815]: I1207 19:30:26.360063 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:30:27 crc kubenswrapper[4815]: I1207 19:30:27.917626 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:27 crc kubenswrapper[4815]: I1207 19:30:27.917997 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:27 crc kubenswrapper[4815]: I1207 19:30:27.967368 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:28 crc kubenswrapper[4815]: I1207 19:30:28.194692 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-xw277" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.791110 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv"] Dec 07 19:30:29 crc kubenswrapper[4815]: E1207 19:30:29.791409 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" containerName="registry-server" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.791424 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" containerName="registry-server" Dec 07 19:30:29 crc kubenswrapper[4815]: 
I1207 19:30:29.791559 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0a84e0-49dc-4f77-8e27-d9a9d98547c3" containerName="registry-server" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.792495 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.794493 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-j9vxc" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.811835 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv"] Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.948571 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gctr9\" (UniqueName: \"kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.948999 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:29 crc kubenswrapper[4815]: I1207 19:30:29.949213 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.050542 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.050641 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.050807 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gctr9\" (UniqueName: \"kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.051199 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: 
\"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.117634 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.145421 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gctr9\" (UniqueName: \"kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9\") pod \"4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:30 crc kubenswrapper[4815]: I1207 19:30:30.412171 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:31 crc kubenswrapper[4815]: I1207 19:30:31.177542 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv"] Dec 07 19:30:31 crc kubenswrapper[4815]: W1207 19:30:31.183370 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3a7f0a_597d_40ee_a0ad_7a83ff99a3c8.slice/crio-af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99 WatchSource:0}: Error finding container af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99: Status 404 returned error can't find the container with id af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99 Dec 07 19:30:31 crc kubenswrapper[4815]: I1207 19:30:31.198508 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" event={"ID":"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8","Type":"ContainerStarted","Data":"af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99"} Dec 07 19:30:32 crc kubenswrapper[4815]: I1207 19:30:32.246885 4815 generic.go:334] "Generic (PLEG): container finished" podID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerID="66169c8df75304739b1541dfa9dcba9e8ae214f0796b2a6d2fcbe90a989e2994" exitCode=0 Dec 07 19:30:32 crc kubenswrapper[4815]: I1207 19:30:32.246936 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" event={"ID":"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8","Type":"ContainerDied","Data":"66169c8df75304739b1541dfa9dcba9e8ae214f0796b2a6d2fcbe90a989e2994"} Dec 07 19:30:33 crc kubenswrapper[4815]: I1207 19:30:33.261003 4815 generic.go:334] "Generic (PLEG): container finished" 
podID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerID="11b37b9d456c201c150ee4240a691a95e1fedf9dfc799c8ebc6235c3192984bd" exitCode=0 Dec 07 19:30:33 crc kubenswrapper[4815]: I1207 19:30:33.261076 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" event={"ID":"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8","Type":"ContainerDied","Data":"11b37b9d456c201c150ee4240a691a95e1fedf9dfc799c8ebc6235c3192984bd"} Dec 07 19:30:34 crc kubenswrapper[4815]: I1207 19:30:34.272237 4815 generic.go:334] "Generic (PLEG): container finished" podID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerID="dce69d00f29d76c937179124a430651205aedfead40c6702475ac0056f5217a7" exitCode=0 Dec 07 19:30:34 crc kubenswrapper[4815]: I1207 19:30:34.272318 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" event={"ID":"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8","Type":"ContainerDied","Data":"dce69d00f29d76c937179124a430651205aedfead40c6702475ac0056f5217a7"} Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.520764 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.686484 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gctr9\" (UniqueName: \"kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9\") pod \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.686639 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle\") pod \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.686712 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util\") pod \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\" (UID: \"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8\") " Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.687547 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle" (OuterVolumeSpecName: "bundle") pod "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" (UID: "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.691554 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9" (OuterVolumeSpecName: "kube-api-access-gctr9") pod "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" (UID: "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8"). InnerVolumeSpecName "kube-api-access-gctr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.700833 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util" (OuterVolumeSpecName: "util") pod "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" (UID: "2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.788156 4815 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-util\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.788205 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gctr9\" (UniqueName: \"kubernetes.io/projected/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-kube-api-access-gctr9\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:35 crc kubenswrapper[4815]: I1207 19:30:35.788227 4815 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:30:36 crc kubenswrapper[4815]: I1207 19:30:36.294547 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" event={"ID":"2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8","Type":"ContainerDied","Data":"af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99"} Dec 07 19:30:36 crc kubenswrapper[4815]: I1207 19:30:36.294756 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv" Dec 07 19:30:36 crc kubenswrapper[4815]: I1207 19:30:36.295098 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af465e4831dee68a5a2d19578b899bf123c74a53ad3042d61f0582bb1f5cef99" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.070554 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8"] Dec 07 19:30:42 crc kubenswrapper[4815]: E1207 19:30:42.071380 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="extract" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.071395 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="extract" Dec 07 19:30:42 crc kubenswrapper[4815]: E1207 19:30:42.071415 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="util" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.071424 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="util" Dec 07 19:30:42 crc kubenswrapper[4815]: E1207 19:30:42.071438 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="pull" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.071447 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="pull" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.071583 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8" containerName="extract" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.072076 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.074816 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-vqg4l" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.076606 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94nhd\" (UniqueName: \"kubernetes.io/projected/0d9bc620-6be8-4f29-aa0a-40c6f9d9c341-kube-api-access-94nhd\") pod \"openstack-operator-controller-operator-76dbcc48db-k25j8\" (UID: \"0d9bc620-6be8-4f29-aa0a-40c6f9d9c341\") " pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.096465 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8"] Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.177205 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94nhd\" (UniqueName: \"kubernetes.io/projected/0d9bc620-6be8-4f29-aa0a-40c6f9d9c341-kube-api-access-94nhd\") pod \"openstack-operator-controller-operator-76dbcc48db-k25j8\" (UID: \"0d9bc620-6be8-4f29-aa0a-40c6f9d9c341\") " pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.206999 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94nhd\" (UniqueName: \"kubernetes.io/projected/0d9bc620-6be8-4f29-aa0a-40c6f9d9c341-kube-api-access-94nhd\") pod \"openstack-operator-controller-operator-76dbcc48db-k25j8\" (UID: \"0d9bc620-6be8-4f29-aa0a-40c6f9d9c341\") " pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.391637 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:42 crc kubenswrapper[4815]: I1207 19:30:42.867722 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8"] Dec 07 19:30:42 crc kubenswrapper[4815]: W1207 19:30:42.872859 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d9bc620_6be8_4f29_aa0a_40c6f9d9c341.slice/crio-b8939f0d013302d094020db1062203c3bf0b9f6162c9f6de6dd808222c8db424 WatchSource:0}: Error finding container b8939f0d013302d094020db1062203c3bf0b9f6162c9f6de6dd808222c8db424: Status 404 returned error can't find the container with id b8939f0d013302d094020db1062203c3bf0b9f6162c9f6de6dd808222c8db424 Dec 07 19:30:43 crc kubenswrapper[4815]: I1207 19:30:43.346753 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" event={"ID":"0d9bc620-6be8-4f29-aa0a-40c6f9d9c341","Type":"ContainerStarted","Data":"b8939f0d013302d094020db1062203c3bf0b9f6162c9f6de6dd808222c8db424"} Dec 07 19:30:48 crc kubenswrapper[4815]: I1207 19:30:48.384426 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" event={"ID":"0d9bc620-6be8-4f29-aa0a-40c6f9d9c341","Type":"ContainerStarted","Data":"804e042ec1141f1d6a8467c4af9d0589d716b7ca2f8330774579ddc244deb6c9"} Dec 07 19:30:48 crc kubenswrapper[4815]: I1207 19:30:48.384885 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:30:48 crc kubenswrapper[4815]: I1207 19:30:48.415196 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" podStartSLOduration=1.258556112 podStartE2EDuration="6.41517797s" podCreationTimestamp="2025-12-07 19:30:42 +0000 UTC" firstStartedPulling="2025-12-07 19:30:42.875474679 +0000 UTC m=+947.454464724" lastFinishedPulling="2025-12-07 19:30:48.032096537 +0000 UTC m=+952.611086582" observedRunningTime="2025-12-07 19:30:48.409311531 +0000 UTC m=+952.988301606" watchObservedRunningTime="2025-12-07 19:30:48.41517797 +0000 UTC m=+952.994168015" Dec 07 19:30:56 crc kubenswrapper[4815]: I1207 19:30:56.359871 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:30:56 crc kubenswrapper[4815]: I1207 19:30:56.360315 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:31:02 crc kubenswrapper[4815]: I1207 19:31:02.395105 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-76dbcc48db-k25j8" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.766749 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.768997 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.770563 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.774068 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4k7t\" (UniqueName: \"kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.774137 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.774234 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.874973 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.875263 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m4k7t\" (UniqueName: \"kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.875389 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.875498 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.875804 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:06 crc kubenswrapper[4815]: I1207 19:31:06.893635 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4k7t\" (UniqueName: \"kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t\") pod \"certified-operators-zljgk\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:07 crc kubenswrapper[4815]: I1207 19:31:07.083812 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:07 crc kubenswrapper[4815]: I1207 19:31:07.692338 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:08 crc kubenswrapper[4815]: I1207 19:31:08.589512 4815 generic.go:334] "Generic (PLEG): container finished" podID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerID="df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b" exitCode=0 Dec 07 19:31:08 crc kubenswrapper[4815]: I1207 19:31:08.589786 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerDied","Data":"df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b"} Dec 07 19:31:08 crc kubenswrapper[4815]: I1207 19:31:08.589810 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerStarted","Data":"d6fa7c9edde05e64f521c334f00d3ccdf595f10ddd823093da576a62f57e632c"} Dec 07 19:31:09 crc kubenswrapper[4815]: I1207 19:31:09.913454 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:09 crc kubenswrapper[4815]: I1207 19:31:09.914530 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:09 crc kubenswrapper[4815]: I1207 19:31:09.975966 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:09 crc kubenswrapper[4815]: I1207 19:31:09.976005 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ksbj\" (UniqueName: \"kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:09 crc kubenswrapper[4815]: I1207 19:31:09.976029 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.024034 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.181366 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.181423 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5ksbj\" (UniqueName: \"kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.181453 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.181929 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.182009 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.255823 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ksbj\" (UniqueName: \"kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj\") pod \"community-operators-snsd5\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.278475 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:10 crc kubenswrapper[4815]: I1207 19:31:10.672263 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerStarted","Data":"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014"} Dec 07 19:31:11 crc kubenswrapper[4815]: I1207 19:31:11.504688 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:11 crc kubenswrapper[4815]: I1207 19:31:11.799152 4815 generic.go:334] "Generic (PLEG): container finished" podID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerID="dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014" exitCode=0 Dec 07 19:31:11 crc kubenswrapper[4815]: I1207 19:31:11.799207 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerDied","Data":"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014"} Dec 07 19:31:11 crc kubenswrapper[4815]: I1207 19:31:11.804517 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerStarted","Data":"cc6581f6b5fc455f1cff61b3e2c9da73ec8d7e8473f5d294c473e7bcf8118bf2"} Dec 07 19:31:12 crc kubenswrapper[4815]: I1207 19:31:12.814280 4815 generic.go:334] "Generic (PLEG): container finished" podID="1b1c55a7-0911-4926-92da-1710e03daba2" containerID="53e871fe6dfb5c13e672331f9acfb0b424eb8b53b9f15e0cd10ac0be0eb423df" exitCode=0 Dec 07 19:31:12 crc kubenswrapper[4815]: I1207 19:31:12.814464 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" 
event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerDied","Data":"53e871fe6dfb5c13e672331f9acfb0b424eb8b53b9f15e0cd10ac0be0eb423df"} Dec 07 19:31:13 crc kubenswrapper[4815]: I1207 19:31:13.821415 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerStarted","Data":"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39"} Dec 07 19:31:13 crc kubenswrapper[4815]: I1207 19:31:13.827642 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerStarted","Data":"e91f67df91f1c2442cd8312085dd66a659e66aba5c9e0d84e0a9d0a91cbe51d9"} Dec 07 19:31:13 crc kubenswrapper[4815]: I1207 19:31:13.962377 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zljgk" podStartSLOduration=3.775806411 podStartE2EDuration="7.962360784s" podCreationTimestamp="2025-12-07 19:31:06 +0000 UTC" firstStartedPulling="2025-12-07 19:31:08.592087139 +0000 UTC m=+973.171077184" lastFinishedPulling="2025-12-07 19:31:12.778641512 +0000 UTC m=+977.357631557" observedRunningTime="2025-12-07 19:31:13.93024236 +0000 UTC m=+978.509232405" watchObservedRunningTime="2025-12-07 19:31:13.962360784 +0000 UTC m=+978.541350819" Dec 07 19:31:14 crc kubenswrapper[4815]: I1207 19:31:14.913703 4815 generic.go:334] "Generic (PLEG): container finished" podID="1b1c55a7-0911-4926-92da-1710e03daba2" containerID="e91f67df91f1c2442cd8312085dd66a659e66aba5c9e0d84e0a9d0a91cbe51d9" exitCode=0 Dec 07 19:31:14 crc kubenswrapper[4815]: I1207 19:31:14.914999 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" 
event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerDied","Data":"e91f67df91f1c2442cd8312085dd66a659e66aba5c9e0d84e0a9d0a91cbe51d9"} Dec 07 19:31:16 crc kubenswrapper[4815]: I1207 19:31:16.931475 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerStarted","Data":"e91938526d28edd335c004632c099c8d5624470d4c3a70a68868af4fc61f67bb"} Dec 07 19:31:16 crc kubenswrapper[4815]: I1207 19:31:16.974803 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-snsd5" podStartSLOduration=4.882550618 podStartE2EDuration="7.974789191s" podCreationTimestamp="2025-12-07 19:31:09 +0000 UTC" firstStartedPulling="2025-12-07 19:31:12.815714399 +0000 UTC m=+977.394704444" lastFinishedPulling="2025-12-07 19:31:15.907952972 +0000 UTC m=+980.486943017" observedRunningTime="2025-12-07 19:31:16.973455492 +0000 UTC m=+981.552445537" watchObservedRunningTime="2025-12-07 19:31:16.974789191 +0000 UTC m=+981.553779236" Dec 07 19:31:17 crc kubenswrapper[4815]: I1207 19:31:17.084830 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:17 crc kubenswrapper[4815]: I1207 19:31:17.084903 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:18 crc kubenswrapper[4815]: I1207 19:31:18.179175 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zljgk" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="registry-server" probeResult="failure" output=< Dec 07 19:31:18 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:31:18 crc kubenswrapper[4815]: > Dec 07 19:31:20 crc kubenswrapper[4815]: I1207 19:31:20.279475 4815 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:20 crc kubenswrapper[4815]: I1207 19:31:20.279739 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:21 crc kubenswrapper[4815]: I1207 19:31:21.347046 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-snsd5" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="registry-server" probeResult="failure" output=< Dec 07 19:31:21 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:31:21 crc kubenswrapper[4815]: > Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.929898 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f"] Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.931064 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.934500 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-9rfc2" Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.953235 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f"] Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.973460 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd"] Dec 07 19:31:23 crc kubenswrapper[4815]: I1207 19:31:23.976163 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.288182 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wlgp8" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.288994 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/0cc9e387-27e4-4b5d-ac7a-d9f098acb973-kube-api-access-9ssjw\") pod \"cinder-operator-controller-manager-6c677c69b-bq2qd\" (UID: \"0cc9e387-27e4-4b5d-ac7a-d9f098acb973\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.289023 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdf77\" (UniqueName: \"kubernetes.io/projected/4c8e53f4-6dec-4655-b931-b8d0b8ddc8da-kube-api-access-bdf77\") pod \"barbican-operator-controller-manager-7d9dfd778-jr22f\" (UID: \"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.341974 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.342988 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.352756 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dpznt" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.368301 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.388219 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.389489 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.390931 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/0cc9e387-27e4-4b5d-ac7a-d9f098acb973-kube-api-access-9ssjw\") pod \"cinder-operator-controller-manager-6c677c69b-bq2qd\" (UID: \"0cc9e387-27e4-4b5d-ac7a-d9f098acb973\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.391093 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdf77\" (UniqueName: \"kubernetes.io/projected/4c8e53f4-6dec-4655-b931-b8d0b8ddc8da-kube-api-access-bdf77\") pod \"barbican-operator-controller-manager-7d9dfd778-jr22f\" (UID: \"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.392644 4815 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dmd79" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.401965 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.409411 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.427319 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.428276 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.432526 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-ctp7h" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.439182 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ssjw\" (UniqueName: \"kubernetes.io/projected/0cc9e387-27e4-4b5d-ac7a-d9f098acb973-kube-api-access-9ssjw\") pod \"cinder-operator-controller-manager-6c677c69b-bq2qd\" (UID: \"0cc9e387-27e4-4b5d-ac7a-d9f098acb973\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.462815 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdf77\" (UniqueName: \"kubernetes.io/projected/4c8e53f4-6dec-4655-b931-b8d0b8ddc8da-kube-api-access-bdf77\") pod \"barbican-operator-controller-manager-7d9dfd778-jr22f\" (UID: \"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 
19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.471675 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.487046 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.488088 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.491562 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-bd588" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.494736 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhbn\" (UniqueName: \"kubernetes.io/projected/cbeacff5-ac96-4444-aa57-04a320582348-kube-api-access-dwhbn\") pod \"designate-operator-controller-manager-697fb699cf-psfhw\" (UID: \"cbeacff5-ac96-4444-aa57-04a320582348\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.494817 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hg9l\" (UniqueName: \"kubernetes.io/projected/603612f4-25fa-4356-a99c-b054645d8919-kube-api-access-4hg9l\") pod \"glance-operator-controller-manager-5697bb5779-7wwz9\" (UID: \"603612f4-25fa-4356-a99c-b054645d8919\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.517274 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl"] Dec 07 19:31:24 crc 
kubenswrapper[4815]: I1207 19:31:24.517584 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.518473 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.527125 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-v7kcd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.527358 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.560196 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.560319 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.582146 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.583218 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.588977 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cwqbr" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.595730 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhbn\" (UniqueName: \"kubernetes.io/projected/cbeacff5-ac96-4444-aa57-04a320582348-kube-api-access-dwhbn\") pod \"designate-operator-controller-manager-697fb699cf-psfhw\" (UID: \"cbeacff5-ac96-4444-aa57-04a320582348\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.595791 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbw8\" (UniqueName: \"kubernetes.io/projected/33bdbbd1-62ad-42d5-a10a-a5da1344af19-kube-api-access-9cbw8\") pod \"horizon-operator-controller-manager-68c6d99b8f-spddl\" (UID: \"33bdbbd1-62ad-42d5-a10a-a5da1344af19\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.595858 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hg9l\" (UniqueName: \"kubernetes.io/projected/603612f4-25fa-4356-a99c-b054645d8919-kube-api-access-4hg9l\") pod \"glance-operator-controller-manager-5697bb5779-7wwz9\" (UID: \"603612f4-25fa-4356-a99c-b054645d8919\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.595938 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5rr\" (UniqueName: 
\"kubernetes.io/projected/d6eb6a40-b713-4a4a-9554-749f94cf1137-kube-api-access-6n5rr\") pod \"heat-operator-controller-manager-5f64f6f8bb-trnlm\" (UID: \"d6eb6a40-b713-4a4a-9554-749f94cf1137\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.604815 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.606213 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.615394 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.615963 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xj7tt" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.631129 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.659634 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhbn\" (UniqueName: \"kubernetes.io/projected/cbeacff5-ac96-4444-aa57-04a320582348-kube-api-access-dwhbn\") pod \"designate-operator-controller-manager-697fb699cf-psfhw\" (UID: \"cbeacff5-ac96-4444-aa57-04a320582348\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.668096 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hg9l\" (UniqueName: 
\"kubernetes.io/projected/603612f4-25fa-4356-a99c-b054645d8919-kube-api-access-4hg9l\") pod \"glance-operator-controller-manager-5697bb5779-7wwz9\" (UID: \"603612f4-25fa-4356-a99c-b054645d8919\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.679299 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.683534 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.686828 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698328 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf"] Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698613 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698753 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698797 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbw8\" (UniqueName: \"kubernetes.io/projected/33bdbbd1-62ad-42d5-a10a-a5da1344af19-kube-api-access-9cbw8\") pod \"horizon-operator-controller-manager-68c6d99b8f-spddl\" (UID: \"33bdbbd1-62ad-42d5-a10a-a5da1344af19\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698818 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4xx\" (UniqueName: \"kubernetes.io/projected/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-kube-api-access-jc4xx\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698836 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb55l\" (UniqueName: \"kubernetes.io/projected/b7efe346-4e91-45a1-84d6-6a5aac1c739c-kube-api-access-jb55l\") pod \"manila-operator-controller-manager-5b5fd79c9c-7d9wv\" (UID: \"b7efe346-4e91-45a1-84d6-6a5aac1c739c\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:31:24 crc kubenswrapper[4815]: 
I1207 19:31:24.698894 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5rr\" (UniqueName: \"kubernetes.io/projected/d6eb6a40-b713-4a4a-9554-749f94cf1137-kube-api-access-6n5rr\") pod \"heat-operator-controller-manager-5f64f6f8bb-trnlm\" (UID: \"d6eb6a40-b713-4a4a-9554-749f94cf1137\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698934 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqzhj\" (UniqueName: \"kubernetes.io/projected/0404b56e-87cb-40c5-b11c-64dc7c960718-kube-api-access-tqzhj\") pod \"keystone-operator-controller-manager-7765d96ddf-5shnd\" (UID: \"0404b56e-87cb-40c5-b11c-64dc7c960718\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.698962 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk997\" (UniqueName: \"kubernetes.io/projected/954764fa-df14-4604-96d2-6ddc12155406-kube-api-access-kk997\") pod \"ironic-operator-controller-manager-967d97867-z7p7t\" (UID: \"954764fa-df14-4604-96d2-6ddc12155406\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.700288 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.709370 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6tj8c" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.709548 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-l6x96" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.709883 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.769901 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbw8\" (UniqueName: \"kubernetes.io/projected/33bdbbd1-62ad-42d5-a10a-a5da1344af19-kube-api-access-9cbw8\") pod \"horizon-operator-controller-manager-68c6d99b8f-spddl\" (UID: \"33bdbbd1-62ad-42d5-a10a-a5da1344af19\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.794228 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5rr\" (UniqueName: \"kubernetes.io/projected/d6eb6a40-b713-4a4a-9554-749f94cf1137-kube-api-access-6n5rr\") pod \"heat-operator-controller-manager-5f64f6f8bb-trnlm\" (UID: \"d6eb6a40-b713-4a4a-9554-749f94cf1137\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.799324 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.815142 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqzhj\" (UniqueName: \"kubernetes.io/projected/0404b56e-87cb-40c5-b11c-64dc7c960718-kube-api-access-tqzhj\") pod \"keystone-operator-controller-manager-7765d96ddf-5shnd\" (UID: \"0404b56e-87cb-40c5-b11c-64dc7c960718\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.815211 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk997\" (UniqueName: \"kubernetes.io/projected/954764fa-df14-4604-96d2-6ddc12155406-kube-api-access-kk997\") pod \"ironic-operator-controller-manager-967d97867-z7p7t\" (UID: \"954764fa-df14-4604-96d2-6ddc12155406\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.815255 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.815286 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4xx\" (UniqueName: \"kubernetes.io/projected/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-kube-api-access-jc4xx\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:24 crc kubenswrapper[4815]: I1207 19:31:24.815319 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jb55l\" (UniqueName: \"kubernetes.io/projected/b7efe346-4e91-45a1-84d6-6a5aac1c739c-kube-api-access-jb55l\") pod \"manila-operator-controller-manager-5b5fd79c9c-7d9wv\" (UID: \"b7efe346-4e91-45a1-84d6-6a5aac1c739c\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:31:24 crc kubenswrapper[4815]: E1207 19:31:24.816386 4815 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:24 crc kubenswrapper[4815]: E1207 19:31:24.816645 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert podName:01adf042-9afe-46be-ba0c-0c1a3f86ed8d nodeName:}" failed. No retries permitted until 2025-12-07 19:31:25.316431774 +0000 UTC m=+989.895421819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert") pod "infra-operator-controller-manager-78d48bff9d-5rmk7" (UID: "01adf042-9afe-46be-ba0c-0c1a3f86ed8d") : secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.060815 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pff4n\" (UniqueName: \"kubernetes.io/projected/8055e37c-efd4-4c82-a9df-4d5e2a12ef63-kube-api-access-pff4n\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x59hf\" (UID: \"8055e37c-efd4-4c82-a9df-4d5e2a12ef63\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.061709 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.095032 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.095067 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.112799 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.123228 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4xx\" (UniqueName: \"kubernetes.io/projected/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-kube-api-access-jc4xx\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.124433 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk997\" (UniqueName: \"kubernetes.io/projected/954764fa-df14-4604-96d2-6ddc12155406-kube-api-access-kk997\") pod \"ironic-operator-controller-manager-967d97867-z7p7t\" (UID: \"954764fa-df14-4604-96d2-6ddc12155406\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.138587 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb55l\" (UniqueName: \"kubernetes.io/projected/b7efe346-4e91-45a1-84d6-6a5aac1c739c-kube-api-access-jb55l\") pod \"manila-operator-controller-manager-5b5fd79c9c-7d9wv\" (UID: \"b7efe346-4e91-45a1-84d6-6a5aac1c739c\") " 
pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.140679 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2cpj2" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.162695 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pff4n\" (UniqueName: \"kubernetes.io/projected/8055e37c-efd4-4c82-a9df-4d5e2a12ef63-kube-api-access-pff4n\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x59hf\" (UID: \"8055e37c-efd4-4c82-a9df-4d5e2a12ef63\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.237239 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.238367 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqzhj\" (UniqueName: \"kubernetes.io/projected/0404b56e-87cb-40c5-b11c-64dc7c960718-kube-api-access-tqzhj\") pod \"keystone-operator-controller-manager-7765d96ddf-5shnd\" (UID: \"0404b56e-87cb-40c5-b11c-64dc7c960718\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.248422 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.248771 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pff4n\" (UniqueName: \"kubernetes.io/projected/8055e37c-efd4-4c82-a9df-4d5e2a12ef63-kube-api-access-pff4n\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-x59hf\" (UID: \"8055e37c-efd4-4c82-a9df-4d5e2a12ef63\") " 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.258463 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.259842 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.263612 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjvr2\" (UniqueName: \"kubernetes.io/projected/c727b772-a57c-4564-bf8d-7c8917b4bb0d-kube-api-access-vjvr2\") pod \"mariadb-operator-controller-manager-79c8c4686c-4wf77\" (UID: \"c727b772-a57c-4564-bf8d-7c8917b4bb0d\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.269250 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nsvl7" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.311474 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.320989 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.322067 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.322868 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.323239 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.325181 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4k6hw" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.325758 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-jbpcq" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.326601 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.329005 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.362514 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.369781 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjvr2\" (UniqueName: \"kubernetes.io/projected/c727b772-a57c-4564-bf8d-7c8917b4bb0d-kube-api-access-vjvr2\") pod \"mariadb-operator-controller-manager-79c8c4686c-4wf77\" (UID: \"c727b772-a57c-4564-bf8d-7c8917b4bb0d\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.369842 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.369886 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqzr\" (UniqueName: \"kubernetes.io/projected/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-kube-api-access-4lqzr\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.369933 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 
19:31:25.369968 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97cc\" (UniqueName: \"kubernetes.io/projected/2cd3ca37-5d31-4068-88df-1344ebfad5e7-kube-api-access-t97cc\") pod \"octavia-operator-controller-manager-998648c74-m6p4v\" (UID: \"2cd3ca37-5d31-4068-88df-1344ebfad5e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.370044 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95zs\" (UniqueName: \"kubernetes.io/projected/0dc4f082-f669-444f-a46e-9bca4cc20f31-kube-api-access-q95zs\") pod \"nova-operator-controller-manager-697bc559fc-8mg5j\" (UID: \"0dc4f082-f669-444f-a46e-9bca4cc20f31\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:31:25 crc kubenswrapper[4815]: E1207 19:31:25.370797 4815 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:25 crc kubenswrapper[4815]: E1207 19:31:25.370891 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert podName:01adf042-9afe-46be-ba0c-0c1a3f86ed8d nodeName:}" failed. No retries permitted until 2025-12-07 19:31:26.370821928 +0000 UTC m=+990.949811973 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert") pod "infra-operator-controller-manager-78d48bff9d-5rmk7" (UID: "01adf042-9afe-46be-ba0c-0c1a3f86ed8d") : secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.375908 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.378334 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.378839 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.402278 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ww4dw" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.405637 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.419523 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.419808 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjvr2\" (UniqueName: \"kubernetes.io/projected/c727b772-a57c-4564-bf8d-7c8917b4bb0d-kube-api-access-vjvr2\") pod \"mariadb-operator-controller-manager-79c8c4686c-4wf77\" (UID: \"c727b772-a57c-4564-bf8d-7c8917b4bb0d\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:31:25 crc kubenswrapper[4815]: 
I1207 19:31:25.429715 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.476494 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqzr\" (UniqueName: \"kubernetes.io/projected/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-kube-api-access-4lqzr\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.476562 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j7bz\" (UniqueName: \"kubernetes.io/projected/34bdeed9-8374-4eea-a1d9-562d933324e9-kube-api-access-7j7bz\") pod \"ovn-operator-controller-manager-b6456fdb6-dtpkv\" (UID: \"34bdeed9-8374-4eea-a1d9-562d933324e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.476612 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97cc\" (UniqueName: \"kubernetes.io/projected/2cd3ca37-5d31-4068-88df-1344ebfad5e7-kube-api-access-t97cc\") pod \"octavia-operator-controller-manager-998648c74-m6p4v\" (UID: \"2cd3ca37-5d31-4068-88df-1344ebfad5e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.476631 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95zs\" (UniqueName: \"kubernetes.io/projected/0dc4f082-f669-444f-a46e-9bca4cc20f31-kube-api-access-q95zs\") pod \"nova-operator-controller-manager-697bc559fc-8mg5j\" (UID: \"0dc4f082-f669-444f-a46e-9bca4cc20f31\") " 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.476684 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: E1207 19:31:25.476793 4815 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:25 crc kubenswrapper[4815]: E1207 19:31:25.476835 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert podName:097ed41c-7445-4a2f-ba72-c9ff11bb0e28 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:25.976822198 +0000 UTC m=+990.555812243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fvvqrm" (UID: "097ed41c-7445-4a2f-ba72-c9ff11bb0e28") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.491112 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.495732 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4sb78"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.496730 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.499415 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.500361 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.507408 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-pzg5q" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.507609 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-qwd5j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.508054 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.513142 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4sb78"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.523964 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.524937 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.525278 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.535751 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.536795 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.537252 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-pzgpd" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.546266 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.549149 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mh94j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.563653 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95zs\" (UniqueName: \"kubernetes.io/projected/0dc4f082-f669-444f-a46e-9bca4cc20f31-kube-api-access-q95zs\") pod \"nova-operator-controller-manager-697bc559fc-8mg5j\" (UID: \"0dc4f082-f669-444f-a46e-9bca4cc20f31\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.564750 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqzr\" (UniqueName: 
\"kubernetes.io/projected/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-kube-api-access-4lqzr\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.583084 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97cc\" (UniqueName: \"kubernetes.io/projected/2cd3ca37-5d31-4068-88df-1344ebfad5e7-kube-api-access-t97cc\") pod \"octavia-operator-controller-manager-998648c74-m6p4v\" (UID: \"2cd3ca37-5d31-4068-88df-1344ebfad5e7\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.583670 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr57n\" (UniqueName: \"kubernetes.io/projected/8e51367a-70c8-4b67-b15f-ee4202171e38-kube-api-access-kr57n\") pod \"telemetry-operator-controller-manager-58d5ff84df-8spbl\" (UID: \"8e51367a-70c8-4b67-b15f-ee4202171e38\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.583692 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97w9j\" (UniqueName: \"kubernetes.io/projected/94d115be-3d8c-46e7-9a22-e09bb888afc8-kube-api-access-97w9j\") pod \"placement-operator-controller-manager-78f8948974-4sb78\" (UID: \"94d115be-3d8c-46e7-9a22-e09bb888afc8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.583734 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j7bz\" (UniqueName: \"kubernetes.io/projected/34bdeed9-8374-4eea-a1d9-562d933324e9-kube-api-access-7j7bz\") pod 
\"ovn-operator-controller-manager-b6456fdb6-dtpkv\" (UID: \"34bdeed9-8374-4eea-a1d9-562d933324e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.583765 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsg8g\" (UniqueName: \"kubernetes.io/projected/2e427dde-fbfa-4b36-9749-e83080d8733a-kube-api-access-zsg8g\") pod \"swift-operator-controller-manager-9d58d64bc-jsdgf\" (UID: \"2e427dde-fbfa-4b36-9749-e83080d8733a\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.593510 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.594686 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.617432 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j7bz\" (UniqueName: \"kubernetes.io/projected/34bdeed9-8374-4eea-a1d9-562d933324e9-kube-api-access-7j7bz\") pod \"ovn-operator-controller-manager-b6456fdb6-dtpkv\" (UID: \"34bdeed9-8374-4eea-a1d9-562d933324e9\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.623532 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6mk2d" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.650312 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.685559 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr57n\" (UniqueName: \"kubernetes.io/projected/8e51367a-70c8-4b67-b15f-ee4202171e38-kube-api-access-kr57n\") pod \"telemetry-operator-controller-manager-58d5ff84df-8spbl\" (UID: \"8e51367a-70c8-4b67-b15f-ee4202171e38\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.685602 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97w9j\" (UniqueName: \"kubernetes.io/projected/94d115be-3d8c-46e7-9a22-e09bb888afc8-kube-api-access-97w9j\") pod \"placement-operator-controller-manager-78f8948974-4sb78\" (UID: \"94d115be-3d8c-46e7-9a22-e09bb888afc8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.685634 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zk7k\" (UniqueName: \"kubernetes.io/projected/21b81ae2-586e-418c-867d-0d10c3c094eb-kube-api-access-8zk7k\") pod \"watcher-operator-controller-manager-667bd8d554-kdbct\" (UID: \"21b81ae2-586e-418c-867d-0d10c3c094eb\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.685665 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsg8g\" (UniqueName: \"kubernetes.io/projected/2e427dde-fbfa-4b36-9749-e83080d8733a-kube-api-access-zsg8g\") pod \"swift-operator-controller-manager-9d58d64bc-jsdgf\" (UID: \"2e427dde-fbfa-4b36-9749-e83080d8733a\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:31:25 crc 
kubenswrapper[4815]: I1207 19:31:25.685691 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szc7k\" (UniqueName: \"kubernetes.io/projected/bed3f9f7-d38b-4987-90fd-1c4a380165f4-kube-api-access-szc7k\") pod \"test-operator-controller-manager-5854674fcc-gfvgj\" (UID: \"bed3f9f7-d38b-4987-90fd-1c4a380165f4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.715337 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct"] Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.721741 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr57n\" (UniqueName: \"kubernetes.io/projected/8e51367a-70c8-4b67-b15f-ee4202171e38-kube-api-access-kr57n\") pod \"telemetry-operator-controller-manager-58d5ff84df-8spbl\" (UID: \"8e51367a-70c8-4b67-b15f-ee4202171e38\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.726272 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97w9j\" (UniqueName: \"kubernetes.io/projected/94d115be-3d8c-46e7-9a22-e09bb888afc8-kube-api-access-97w9j\") pod \"placement-operator-controller-manager-78f8948974-4sb78\" (UID: \"94d115be-3d8c-46e7-9a22-e09bb888afc8\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.732016 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.739676 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsg8g\" (UniqueName: \"kubernetes.io/projected/2e427dde-fbfa-4b36-9749-e83080d8733a-kube-api-access-zsg8g\") pod \"swift-operator-controller-manager-9d58d64bc-jsdgf\" (UID: \"2e427dde-fbfa-4b36-9749-e83080d8733a\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.789292 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szc7k\" (UniqueName: \"kubernetes.io/projected/bed3f9f7-d38b-4987-90fd-1c4a380165f4-kube-api-access-szc7k\") pod \"test-operator-controller-manager-5854674fcc-gfvgj\" (UID: \"bed3f9f7-d38b-4987-90fd-1c4a380165f4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.789522 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zk7k\" (UniqueName: \"kubernetes.io/projected/21b81ae2-586e-418c-867d-0d10c3c094eb-kube-api-access-8zk7k\") pod \"watcher-operator-controller-manager-667bd8d554-kdbct\" (UID: \"21b81ae2-586e-418c-867d-0d10c3c094eb\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.821864 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:31:25 crc kubenswrapper[4815]: I1207 19:31:25.919792 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.022298 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.030026 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.032399 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zk7k\" (UniqueName: \"kubernetes.io/projected/21b81ae2-586e-418c-867d-0d10c3c094eb-kube-api-access-8zk7k\") pod \"watcher-operator-controller-manager-667bd8d554-kdbct\" (UID: \"21b81ae2-586e-418c-867d-0d10c3c094eb\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.034605 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.034746 4815 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.034792 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert podName:097ed41c-7445-4a2f-ba72-c9ff11bb0e28 nodeName:}" failed. 
No retries permitted until 2025-12-07 19:31:27.034781654 +0000 UTC m=+991.613771699 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fvvqrm" (UID: "097ed41c-7445-4a2f-ba72-c9ff11bb0e28") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.046969 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.117440 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szc7k\" (UniqueName: \"kubernetes.io/projected/bed3f9f7-d38b-4987-90fd-1c4a380165f4-kube-api-access-szc7k\") pod \"test-operator-controller-manager-5854674fcc-gfvgj\" (UID: \"bed3f9f7-d38b-4987-90fd-1c4a380165f4\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.242951 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.361054 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.361527 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.361596 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.362006 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.362369 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.362962 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.364683 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac" gracePeriod=600 Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.365280 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8mk6z" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.365293 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.365847 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.371623 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.384126 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.400932 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.401982 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.404261 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6tfpv" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.411081 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.451574 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.451615 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") 
pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.451637 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slbrg\" (UniqueName: \"kubernetes.io/projected/2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd-kube-api-access-slbrg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xx86c\" (UID: \"2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.451706 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwp9d\" (UniqueName: \"kubernetes.io/projected/33f773e2-304c-4d0b-98e9-6fd309462297-kube-api-access-zwp9d\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.451729 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.452539 4815 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.452581 4815 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert podName:01adf042-9afe-46be-ba0c-0c1a3f86ed8d nodeName:}" failed. No retries permitted until 2025-12-07 19:31:28.452566917 +0000 UTC m=+993.031556962 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert") pod "infra-operator-controller-manager-78d48bff9d-5rmk7" (UID: "01adf042-9afe-46be-ba0c-0c1a3f86ed8d") : secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.479895 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.505891 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.555394 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.555466 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slbrg\" (UniqueName: \"kubernetes.io/projected/2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd-kube-api-access-slbrg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xx86c\" (UID: \"2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.555545 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zwp9d\" (UniqueName: \"kubernetes.io/projected/33f773e2-304c-4d0b-98e9-6fd309462297-kube-api-access-zwp9d\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.555572 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.555720 4815 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.555765 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:27.055751206 +0000 UTC m=+991.634741251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.556503 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: E1207 19:31:26.556532 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:27.056524318 +0000 UTC m=+991.635514363 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.576105 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slbrg\" (UniqueName: \"kubernetes.io/projected/2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd-kube-api-access-slbrg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xx86c\" (UID: \"2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.664484 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw"] Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.775039 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" Dec 07 19:31:26 crc kubenswrapper[4815]: I1207 19:31:26.775478 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwp9d\" (UniqueName: \"kubernetes.io/projected/33f773e2-304c-4d0b-98e9-6fd309462297-kube-api-access-zwp9d\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.091647 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.091723 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.091794 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.091815 4815 secret.go:188] Couldn't get secret 
openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.091868 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:28.091848283 +0000 UTC m=+992.670838338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.091945 4815 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.091970 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.091983 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert podName:097ed41c-7445-4a2f-ba72-c9ff11bb0e28 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:29.091973686 +0000 UTC m=+993.670963731 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fvvqrm" (UID: "097ed41c-7445-4a2f-ba72-c9ff11bb0e28") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: E1207 19:31:27.092005 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:28.091994257 +0000 UTC m=+992.670984302 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.142703 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9"] Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.166857 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm"] Dec 07 19:31:27 crc kubenswrapper[4815]: W1207 19:31:27.199847 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6eb6a40_b713_4a4a_9554_749f94cf1137.slice/crio-66a9f0523374d3512229427fe52a4797964514fc1a78bc9ee1d3d0faf1b2a766 WatchSource:0}: Error finding container 66a9f0523374d3512229427fe52a4797964514fc1a78bc9ee1d3d0faf1b2a766: Status 404 returned error can't find the container with id 66a9f0523374d3512229427fe52a4797964514fc1a78bc9ee1d3d0faf1b2a766 Dec 07 19:31:27 crc kubenswrapper[4815]: W1207 19:31:27.211040 4815 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod603612f4_25fa_4356_a99c_b054645d8919.slice/crio-bcabf28246761586c811557ecd29ba285d06da3d6dfd3dde213227c168fa85b5 WatchSource:0}: Error finding container bcabf28246761586c811557ecd29ba285d06da3d6dfd3dde213227c168fa85b5: Status 404 returned error can't find the container with id bcabf28246761586c811557ecd29ba285d06da3d6dfd3dde213227c168fa85b5 Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.212740 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.598123 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" event={"ID":"0cc9e387-27e4-4b5d-ac7a-d9f098acb973","Type":"ContainerStarted","Data":"4d784f8e283995599de37b0654b96b98df890cef95179bfdbe4e6d82205d25d8"} Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.601408 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl"] Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.610662 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.612146 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" event={"ID":"603612f4-25fa-4356-a99c-b054645d8919","Type":"ContainerStarted","Data":"bcabf28246761586c811557ecd29ba285d06da3d6dfd3dde213227c168fa85b5"} Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.628382 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" 
event={"ID":"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da","Type":"ContainerStarted","Data":"baed81e7c0ad7f2bd259ec1648b21c62c4ca2e8f135242ec0529f38f7cfb5ad7"} Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.634221 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" event={"ID":"d6eb6a40-b713-4a4a-9554-749f94cf1137","Type":"ContainerStarted","Data":"66a9f0523374d3512229427fe52a4797964514fc1a78bc9ee1d3d0faf1b2a766"} Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.638752 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" event={"ID":"cbeacff5-ac96-4444-aa57-04a320582348","Type":"ContainerStarted","Data":"331915908b7b979f673447d7f04332d41b7842b60dcbc7d1bcaefcc43373d5e4"} Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.740758 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t"] Dec 07 19:31:27 crc kubenswrapper[4815]: W1207 19:31:27.759533 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0404b56e_87cb_40c5_b11c_64dc7c960718.slice/crio-740c2efcbe5a5a45b5d913ee5109c76d429408008b9fda65ca21d47c52588b24 WatchSource:0}: Error finding container 740c2efcbe5a5a45b5d913ee5109c76d429408008b9fda65ca21d47c52588b24: Status 404 returned error can't find the container with id 740c2efcbe5a5a45b5d913ee5109c76d429408008b9fda65ca21d47c52588b24 Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.761405 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd"] Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.940606 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:27 crc kubenswrapper[4815]: 
I1207 19:31:27.940670 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv"] Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.972777 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v"] Dec 07 19:31:27 crc kubenswrapper[4815]: I1207 19:31:27.985469 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf"] Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.007301 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j"] Dec 07 19:31:28 crc kubenswrapper[4815]: W1207 19:31:28.015148 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc4f082_f669_444f_a46e_9bca4cc20f31.slice/crio-6119ec3a7f33e185bdd3605922af77d568916dcac54bbe87a64230831c7403e8 WatchSource:0}: Error finding container 6119ec3a7f33e185bdd3605922af77d568916dcac54bbe87a64230831c7403e8: Status 404 returned error can't find the container with id 6119ec3a7f33e185bdd3605922af77d568916dcac54bbe87a64230831c7403e8 Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.189845 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.189949 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod 
\"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.189987 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv"] Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.190062 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.190115 4815 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.190118 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:30.190092285 +0000 UTC m=+994.769082330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.190175 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:30.190161327 +0000 UTC m=+994.769151372 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.213086 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf"] Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.247677 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77"] Dec 07 19:31:28 crc kubenswrapper[4815]: W1207 19:31:28.253052 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc727b772_a57c_4564_bf8d_7c8917b4bb0d.slice/crio-b62583d8f679caaf36ebc9c832947d89cc08f1a25ba1f9dd558e357f9f087fb9 WatchSource:0}: Error finding container b62583d8f679caaf36ebc9c832947d89cc08f1a25ba1f9dd558e357f9f087fb9: Status 404 returned error can't find the container with id b62583d8f679caaf36ebc9c832947d89cc08f1a25ba1f9dd558e357f9f087fb9 Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.286002 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj"] Dec 07 19:31:28 crc kubenswrapper[4815]: W1207 19:31:28.287191 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbed3f9f7_d38b_4987_90fd_1c4a380165f4.slice/crio-37947f68e3fa95858ebec80d1c6a1ff1705f3b26a6a4720f5746bbba7e10a497 WatchSource:0}: Error finding container 37947f68e3fa95858ebec80d1c6a1ff1705f3b26a6a4720f5746bbba7e10a497: Status 404 returned error can't find the container with id 37947f68e3fa95858ebec80d1c6a1ff1705f3b26a6a4720f5746bbba7e10a497 Dec 07 19:31:28 crc 
kubenswrapper[4815]: I1207 19:31:28.353308 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct"] Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.366200 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c"] Dec 07 19:31:28 crc kubenswrapper[4815]: W1207 19:31:28.370834 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aeb4ccc_f6a6_4614_917c_55cf0a46c3cd.slice/crio-4418f31d60a7b6e22f670eff83fa904d67a7edc643babba55784efb285444a34 WatchSource:0}: Error finding container 4418f31d60a7b6e22f670eff83fa904d67a7edc643babba55784efb285444a34: Status 404 returned error can't find the container with id 4418f31d60a7b6e22f670eff83fa904d67a7edc643babba55784efb285444a34 Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.382561 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl"] Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.384852 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zk7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-kdbct_openstack-operators(21b81ae2-586e-418c-867d-0d10c3c094eb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.397825 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ 
--logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zk7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-667bd8d554-kdbct_openstack-operators(21b81ae2-586e-418c-867d-0d10c3c094eb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.400986 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" podUID="21b81ae2-586e-418c-867d-0d10c3c094eb" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.405135 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/placement-operator-controller-manager-78f8948974-4sb78"] Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.410802 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kr57n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-8spbl_openstack-operators(8e51367a-70c8-4b67-b15f-ee4202171e38): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.422462 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kr57n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-8spbl_openstack-operators(8e51367a-70c8-4b67-b15f-ee4202171e38): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.423682 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" podUID="8e51367a-70c8-4b67-b15f-ee4202171e38" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.457589 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97w9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4sb78_openstack-operators(94d115be-3d8c-46e7-9a22-e09bb888afc8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.463477 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97w9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4sb78_openstack-operators(94d115be-3d8c-46e7-9a22-e09bb888afc8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.464764 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" podUID="94d115be-3d8c-46e7-9a22-e09bb888afc8" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.498605 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.498783 4815 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.498876 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert podName:01adf042-9afe-46be-ba0c-0c1a3f86ed8d nodeName:}" failed. No retries permitted until 2025-12-07 19:31:32.49885436 +0000 UTC m=+997.077844455 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert") pod "infra-operator-controller-manager-78d48bff9d-5rmk7" (UID: "01adf042-9afe-46be-ba0c-0c1a3f86ed8d") : secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.661935 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" event={"ID":"8055e37c-efd4-4c82-a9df-4d5e2a12ef63","Type":"ContainerStarted","Data":"ca0daf6c1734f4023b74f3154e69059f06a4003bc347bdd34c8cabcdb933f5f9"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.671164 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" event={"ID":"c727b772-a57c-4564-bf8d-7c8917b4bb0d","Type":"ContainerStarted","Data":"b62583d8f679caaf36ebc9c832947d89cc08f1a25ba1f9dd558e357f9f087fb9"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.675113 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" event={"ID":"33bdbbd1-62ad-42d5-a10a-a5da1344af19","Type":"ContainerStarted","Data":"c7ac816119624af2648f781364f50aeb8592dcd9c81a96997cd77e79bc3cc001"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.676440 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" event={"ID":"2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd","Type":"ContainerStarted","Data":"4418f31d60a7b6e22f670eff83fa904d67a7edc643babba55784efb285444a34"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.735325 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" event={"ID":"b7efe346-4e91-45a1-84d6-6a5aac1c739c","Type":"ContainerStarted","Data":"f19ec5b5fb92bad6c74f72c624033c1d9d3ed2e1caa5f1274b1723faa8071e7b"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.741839 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" event={"ID":"94d115be-3d8c-46e7-9a22-e09bb888afc8","Type":"ContainerStarted","Data":"1fc581ea8df71664aaa5e0846a5a21eaec7064350a04e7fc90fd95b80335436a"} Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.749857 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" podUID="94d115be-3d8c-46e7-9a22-e09bb888afc8" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.753618 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" event={"ID":"bed3f9f7-d38b-4987-90fd-1c4a380165f4","Type":"ContainerStarted","Data":"37947f68e3fa95858ebec80d1c6a1ff1705f3b26a6a4720f5746bbba7e10a497"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.779293 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" event={"ID":"2e427dde-fbfa-4b36-9749-e83080d8733a","Type":"ContainerStarted","Data":"c7d7219aa198180da2ea1a6a7164c2775479bd1b3da2ca9a274a52d0d94c3b91"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.781018 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" event={"ID":"0404b56e-87cb-40c5-b11c-64dc7c960718","Type":"ContainerStarted","Data":"740c2efcbe5a5a45b5d913ee5109c76d429408008b9fda65ca21d47c52588b24"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.789472 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" event={"ID":"2cd3ca37-5d31-4068-88df-1344ebfad5e7","Type":"ContainerStarted","Data":"e2eeebcaae4d3086a831275c5e7f33a7244ef61047a67223d95f039ead136e3d"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.793675 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" event={"ID":"0dc4f082-f669-444f-a46e-9bca4cc20f31","Type":"ContainerStarted","Data":"6119ec3a7f33e185bdd3605922af77d568916dcac54bbe87a64230831c7403e8"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.798841 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" event={"ID":"8e51367a-70c8-4b67-b15f-ee4202171e38","Type":"ContainerStarted","Data":"d58cba493bc41f76c01eb9f9acacd8a713eeae3c62d9306a2b1e8d9459f31cf9"} Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.803455 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" podUID="8e51367a-70c8-4b67-b15f-ee4202171e38" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.812238 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" event={"ID":"21b81ae2-586e-418c-867d-0d10c3c094eb","Type":"ContainerStarted","Data":"b0cc99710ab95fabb2f8f488727b230fc7cd90e1b4e006ec0d16bb7cb1189aff"} Dec 07 19:31:28 crc kubenswrapper[4815]: E1207 19:31:28.819726 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" podUID="21b81ae2-586e-418c-867d-0d10c3c094eb" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.854302 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" event={"ID":"954764fa-df14-4604-96d2-6ddc12155406","Type":"ContainerStarted","Data":"663f6f19bc47dedd12ffd51bb781d3ef55d4a737762a90fdc020f80c448a9055"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.891414 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" 
containerID="79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac" exitCode=0 Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.891476 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac"} Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.891511 4815 scope.go:117] "RemoveContainer" containerID="e659021b677075a133a9841eb7e4e0a1041afcc4554ca5f769bf95532398a212" Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.894018 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zljgk" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="registry-server" containerID="cri-o://1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39" gracePeriod=2 Dec 07 19:31:28 crc kubenswrapper[4815]: I1207 19:31:28.894209 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" event={"ID":"34bdeed9-8374-4eea-a1d9-562d933324e9","Type":"ContainerStarted","Data":"41017d0ecf9b07cbfa48c851dc1a7f13f54f80bee4b880a359f859f19b5d51cb"} Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.108770 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:29 crc kubenswrapper[4815]: E1207 19:31:29.109162 4815 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not 
found Dec 07 19:31:29 crc kubenswrapper[4815]: E1207 19:31:29.109204 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert podName:097ed41c-7445-4a2f-ba72-c9ff11bb0e28 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:33.109190564 +0000 UTC m=+997.688180609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fvvqrm" (UID: "097ed41c-7445-4a2f-ba72-c9ff11bb0e28") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.643931 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.836247 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content\") pod \"83888d61-30ef-4fc4-97f1-66d4074e355c\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.836397 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4k7t\" (UniqueName: \"kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t\") pod \"83888d61-30ef-4fc4-97f1-66d4074e355c\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.836440 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities\") pod \"83888d61-30ef-4fc4-97f1-66d4074e355c\" (UID: \"83888d61-30ef-4fc4-97f1-66d4074e355c\") " Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 
19:31:29.845989 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities" (OuterVolumeSpecName: "utilities") pod "83888d61-30ef-4fc4-97f1-66d4074e355c" (UID: "83888d61-30ef-4fc4-97f1-66d4074e355c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.879467 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t" (OuterVolumeSpecName: "kube-api-access-m4k7t") pod "83888d61-30ef-4fc4-97f1-66d4074e355c" (UID: "83888d61-30ef-4fc4-97f1-66d4074e355c"). InnerVolumeSpecName "kube-api-access-m4k7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.949590 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4k7t\" (UniqueName: \"kubernetes.io/projected/83888d61-30ef-4fc4-97f1-66d4074e355c-kube-api-access-m4k7t\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:29 crc kubenswrapper[4815]: I1207 19:31:29.949645 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:29.997718 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83888d61-30ef-4fc4-97f1-66d4074e355c" (UID: "83888d61-30ef-4fc4-97f1-66d4074e355c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.025048 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e"} Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.051025 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83888d61-30ef-4fc4-97f1-66d4074e355c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.061958 4815 generic.go:334] "Generic (PLEG): container finished" podID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerID="1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39" exitCode=0 Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.062700 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zljgk" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.065857 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerDied","Data":"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39"} Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.065976 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zljgk" event={"ID":"83888d61-30ef-4fc4-97f1-66d4074e355c","Type":"ContainerDied","Data":"d6fa7c9edde05e64f521c334f00d3ccdf595f10ddd823093da576a62f57e632c"} Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.066048 4815 scope.go:117] "RemoveContainer" containerID="1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.100684 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" podUID="94d115be-3d8c-46e7-9a22-e09bb888afc8" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.101077 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6b3e0302608a2e70f9b5ae9167f6fbf59264f226d9db99d48f70466ab2f216b8\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" podUID="21b81ae2-586e-418c-867d-0d10c3c094eb" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.105839 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" podUID="8e51367a-70c8-4b67-b15f-ee4202171e38" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.153770 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.167317 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zljgk"] Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.202556 4815 scope.go:117] "RemoveContainer" containerID="dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.233846 4815 scope.go:117] "RemoveContainer" containerID="df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.261329 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 
19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.261606 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.261741 4815 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.261789 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:34.261773641 +0000 UTC m=+998.840763686 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.262197 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.262245 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:34.262230144 +0000 UTC m=+998.841220189 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.339055 4815 scope.go:117] "RemoveContainer" containerID="1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.339843 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39\": container with ID starting with 1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39 not found: ID does not exist" containerID="1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.339886 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39"} err="failed to get container status \"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39\": rpc error: code = NotFound desc = could not find container \"1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39\": container with ID starting with 1de32c423f98e3683ff4a1f36be417291027cea364f3360131e7f1b005db9e39 not found: ID does not exist" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.339931 4815 scope.go:117] "RemoveContainer" containerID="dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.343814 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014\": container with ID starting 
with dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014 not found: ID does not exist" containerID="dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.343848 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014"} err="failed to get container status \"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014\": rpc error: code = NotFound desc = could not find container \"dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014\": container with ID starting with dfec260fdf5dfc6d507c9eede67e0e597a28616da6ebe047d308b376dea2f014 not found: ID does not exist" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.343867 4815 scope.go:117] "RemoveContainer" containerID="df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.345578 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b\": container with ID starting with df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b not found: ID does not exist" containerID="df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.345631 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b"} err="failed to get container status \"df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b\": rpc error: code = NotFound desc = could not find container \"df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b\": container with ID starting with df312392ad638bdbc8203e1296945b62154717f72a0947ee9edf7cce651dae4b not found: ID does 
not exist" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.426883 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.517088 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.949010 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.949384 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="registry-server" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.949398 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="registry-server" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.949406 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="extract-utilities" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.949412 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="extract-utilities" Dec 07 19:31:30 crc kubenswrapper[4815]: E1207 19:31:30.949428 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="extract-content" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.949433 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="extract-content" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.949625 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" containerName="registry-server" Dec 07 19:31:30 crc 
kubenswrapper[4815]: I1207 19:31:30.951877 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.955000 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.985580 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcf62\" (UniqueName: \"kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.985686 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:30 crc kubenswrapper[4815]: I1207 19:31:30.985823 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.091597 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcf62\" (UniqueName: \"kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 
19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.091702 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.091754 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.092208 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.092256 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.113940 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcf62\" (UniqueName: \"kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62\") pod \"redhat-marketplace-4hccq\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.266744 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:31:31 crc kubenswrapper[4815]: I1207 19:31:31.914643 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83888d61-30ef-4fc4-97f1-66d4074e355c" path="/var/lib/kubelet/pods/83888d61-30ef-4fc4-97f1-66d4074e355c/volumes" Dec 07 19:31:32 crc kubenswrapper[4815]: I1207 19:31:32.539431 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:32 crc kubenswrapper[4815]: E1207 19:31:32.539738 4815 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:32 crc kubenswrapper[4815]: E1207 19:31:32.539822 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert podName:01adf042-9afe-46be-ba0c-0c1a3f86ed8d nodeName:}" failed. No retries permitted until 2025-12-07 19:31:40.539788853 +0000 UTC m=+1005.118778898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert") pod "infra-operator-controller-manager-78d48bff9d-5rmk7" (UID: "01adf042-9afe-46be-ba0c-0c1a3f86ed8d") : secret "infra-operator-webhook-server-cert" not found Dec 07 19:31:32 crc kubenswrapper[4815]: I1207 19:31:32.939548 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:31:33 crc kubenswrapper[4815]: I1207 19:31:33.135739 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerStarted","Data":"7bf61a096267b25a66c6d2b9d610acd25418af867890e77b4ccb77264d985222"} Dec 07 19:31:33 crc kubenswrapper[4815]: I1207 19:31:33.173184 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:33 crc kubenswrapper[4815]: E1207 19:31:33.173280 4815 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:33 crc kubenswrapper[4815]: E1207 19:31:33.173334 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert podName:097ed41c-7445-4a2f-ba72-c9ff11bb0e28 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:41.173318634 +0000 UTC m=+1005.752308679 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert") pod "openstack-baremetal-operator-controller-manager-84b575879fvvqrm" (UID: "097ed41c-7445-4a2f-ba72-c9ff11bb0e28") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 07 19:31:33 crc kubenswrapper[4815]: I1207 19:31:33.376302 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:33 crc kubenswrapper[4815]: I1207 19:31:33.376875 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-snsd5" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="registry-server" containerID="cri-o://e91938526d28edd335c004632c099c8d5624470d4c3a70a68868af4fc61f67bb" gracePeriod=2 Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.184394 4815 generic.go:334] "Generic (PLEG): container finished" podID="1b1c55a7-0911-4926-92da-1710e03daba2" containerID="e91938526d28edd335c004632c099c8d5624470d4c3a70a68868af4fc61f67bb" exitCode=0 Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.184524 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerDied","Data":"e91938526d28edd335c004632c099c8d5624470d4c3a70a68868af4fc61f67bb"} Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.192292 4815 generic.go:334] "Generic (PLEG): container finished" podID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerID="f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46" exitCode=0 Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.192340 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" 
event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerDied","Data":"f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46"} Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.317706 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.318100 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:34 crc kubenswrapper[4815]: E1207 19:31:34.318265 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:34 crc kubenswrapper[4815]: E1207 19:31:34.318311 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:42.318296593 +0000 UTC m=+1006.897286628 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:34 crc kubenswrapper[4815]: E1207 19:31:34.318826 4815 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:34 crc kubenswrapper[4815]: E1207 19:31:34.318857 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:42.318850309 +0000 UTC m=+1006.897840344 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.328374 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.521268 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ksbj\" (UniqueName: \"kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj\") pod \"1b1c55a7-0911-4926-92da-1710e03daba2\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.521331 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities\") pod \"1b1c55a7-0911-4926-92da-1710e03daba2\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.521407 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content\") pod \"1b1c55a7-0911-4926-92da-1710e03daba2\" (UID: \"1b1c55a7-0911-4926-92da-1710e03daba2\") " Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.523365 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities" (OuterVolumeSpecName: "utilities") pod "1b1c55a7-0911-4926-92da-1710e03daba2" (UID: "1b1c55a7-0911-4926-92da-1710e03daba2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.528490 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj" (OuterVolumeSpecName: "kube-api-access-5ksbj") pod "1b1c55a7-0911-4926-92da-1710e03daba2" (UID: "1b1c55a7-0911-4926-92da-1710e03daba2"). InnerVolumeSpecName "kube-api-access-5ksbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.587889 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b1c55a7-0911-4926-92da-1710e03daba2" (UID: "1b1c55a7-0911-4926-92da-1710e03daba2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.623173 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.623200 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ksbj\" (UniqueName: \"kubernetes.io/projected/1b1c55a7-0911-4926-92da-1710e03daba2-kube-api-access-5ksbj\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:34 crc kubenswrapper[4815]: I1207 19:31:34.623212 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b1c55a7-0911-4926-92da-1710e03daba2-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.220177 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snsd5" event={"ID":"1b1c55a7-0911-4926-92da-1710e03daba2","Type":"ContainerDied","Data":"cc6581f6b5fc455f1cff61b3e2c9da73ec8d7e8473f5d294c473e7bcf8118bf2"} Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.220496 4815 scope.go:117] "RemoveContainer" containerID="e91938526d28edd335c004632c099c8d5624470d4c3a70a68868af4fc61f67bb" Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.220559 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snsd5" Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.264878 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.275331 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-snsd5"] Dec 07 19:31:35 crc kubenswrapper[4815]: I1207 19:31:35.788900 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" path="/var/lib/kubelet/pods/1b1c55a7-0911-4926-92da-1710e03daba2/volumes" Dec 07 19:31:40 crc kubenswrapper[4815]: I1207 19:31:40.576666 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:40 crc kubenswrapper[4815]: I1207 19:31:40.585151 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/01adf042-9afe-46be-ba0c-0c1a3f86ed8d-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5rmk7\" (UID: \"01adf042-9afe-46be-ba0c-0c1a3f86ed8d\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:40 crc kubenswrapper[4815]: I1207 19:31:40.740507 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:31:41 crc kubenswrapper[4815]: I1207 19:31:41.237697 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:41 crc kubenswrapper[4815]: I1207 19:31:41.245087 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/097ed41c-7445-4a2f-ba72-c9ff11bb0e28-cert\") pod \"openstack-baremetal-operator-controller-manager-84b575879fvvqrm\" (UID: \"097ed41c-7445-4a2f-ba72-c9ff11bb0e28\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:41 crc kubenswrapper[4815]: I1207 19:31:41.261940 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:31:42 crc kubenswrapper[4815]: I1207 19:31:42.387272 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:42 crc kubenswrapper[4815]: I1207 19:31:42.387543 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:42 crc kubenswrapper[4815]: E1207 19:31:42.387464 4815 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 07 19:31:42 crc kubenswrapper[4815]: E1207 19:31:42.387617 4815 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 07 19:31:42 crc kubenswrapper[4815]: E1207 19:31:42.387664 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:58.387624758 +0000 UTC m=+1022.966614813 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "webhook-server-cert" not found Dec 07 19:31:42 crc kubenswrapper[4815]: E1207 19:31:42.387685 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs podName:33f773e2-304c-4d0b-98e9-6fd309462297 nodeName:}" failed. No retries permitted until 2025-12-07 19:31:58.387676849 +0000 UTC m=+1022.966666894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs") pod "openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" (UID: "33f773e2-304c-4d0b-98e9-6fd309462297") : secret "metrics-server-cert" not found Dec 07 19:31:45 crc kubenswrapper[4815]: E1207 19:31:45.671926 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 07 19:31:45 crc kubenswrapper[4815]: E1207 19:31:45.672435 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwhbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-psfhw_openstack-operators(cbeacff5-ac96-4444-aa57-04a320582348): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:46 crc kubenswrapper[4815]: E1207 19:31:46.381336 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 07 19:31:46 crc kubenswrapper[4815]: E1207 19:31:46.381552 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t97cc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-m6p4v_openstack-operators(2cd3ca37-5d31-4068-88df-1344ebfad5e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:47 crc kubenswrapper[4815]: E1207 19:31:47.663908 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 07 19:31:47 crc kubenswrapper[4815]: E1207 19:31:47.664483 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdf77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-jr22f_openstack-operators(4c8e53f4-6dec-4655-b931-b8d0b8ddc8da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:51 crc kubenswrapper[4815]: E1207 19:31:51.258592 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 07 19:31:51 crc kubenswrapper[4815]: E1207 19:31:51.259303 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jb55l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-7d9wv_openstack-operators(b7efe346-4e91-45a1-84d6-6a5aac1c739c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:52 crc kubenswrapper[4815]: E1207 19:31:52.716600 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 07 19:31:52 crc kubenswrapper[4815]: E1207 19:31:52.716857 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kk997,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-z7p7t_openstack-operators(954764fa-df14-4604-96d2-6ddc12155406): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:53 crc kubenswrapper[4815]: E1207 19:31:53.345267 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 07 19:31:53 crc kubenswrapper[4815]: E1207 19:31:53.345535 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zsg8g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-jsdgf_openstack-operators(2e427dde-fbfa-4b36-9749-e83080d8733a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:55 crc kubenswrapper[4815]: E1207 19:31:55.009548 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 07 19:31:55 crc kubenswrapper[4815]: E1207 19:31:55.010159 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-szc7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-gfvgj_openstack-operators(bed3f9f7-d38b-4987-90fd-1c4a380165f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:55 crc kubenswrapper[4815]: E1207 19:31:55.569470 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 07 19:31:55 crc kubenswrapper[4815]: E1207 19:31:55.569642 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4hg9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-7wwz9_openstack-operators(603612f4-25fa-4356-a99c-b054645d8919): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:57 crc kubenswrapper[4815]: E1207 19:31:57.107464 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 07 19:31:57 crc kubenswrapper[4815]: E1207 19:31:57.108320 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6n5rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-trnlm_openstack-operators(d6eb6a40-b713-4a4a-9554-749f94cf1137): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.487825 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.488071 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.494048 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-webhook-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.495051 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/33f773e2-304c-4d0b-98e9-6fd309462297-metrics-certs\") pod \"openstack-operator-controller-manager-7dc7d5d6ff-dnrvh\" (UID: \"33f773e2-304c-4d0b-98e9-6fd309462297\") " pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.506508 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8mk6z" Dec 07 19:31:58 crc kubenswrapper[4815]: I1207 19:31:58.515392 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:32:00 crc kubenswrapper[4815]: E1207 19:32:00.185609 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 07 19:32:00 crc kubenswrapper[4815]: E1207 19:32:00.186352 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vjvr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-4wf77_openstack-operators(c727b772-a57c-4564-bf8d-7c8917b4bb0d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:00 crc kubenswrapper[4815]: I1207 19:32:00.217334 4815 scope.go:117] "RemoveContainer" containerID="e91f67df91f1c2442cd8312085dd66a659e66aba5c9e0d84e0a9d0a91cbe51d9" Dec 07 19:32:00 crc kubenswrapper[4815]: E1207 19:32:00.864522 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 07 19:32:00 crc kubenswrapper[4815]: E1207 19:32:00.865070 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pff4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-x59hf_openstack-operators(8055e37c-efd4-4c82-a9df-4d5e2a12ef63): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:01 crc kubenswrapper[4815]: E1207 19:32:01.381635 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 07 19:32:01 crc kubenswrapper[4815]: E1207 19:32:01.381843 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-97w9j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-4sb78_openstack-operators(94d115be-3d8c-46e7-9a22-e09bb888afc8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:01 crc kubenswrapper[4815]: E1207 19:32:01.880020 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 07 19:32:01 crc kubenswrapper[4815]: E1207 19:32:01.880240 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tqzhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-5shnd_openstack-operators(0404b56e-87cb-40c5-b11c-64dc7c960718): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:02 crc kubenswrapper[4815]: E1207 19:32:02.461451 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 07 19:32:02 crc kubenswrapper[4815]: E1207 19:32:02.461621 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q95zs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-8mg5j_openstack-operators(0dc4f082-f669-444f-a46e-9bca4cc20f31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:02 crc kubenswrapper[4815]: E1207 19:32:02.927441 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 07 19:32:02 crc kubenswrapper[4815]: E1207 19:32:02.927628 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-slbrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xx86c_openstack-operators(2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:02 crc kubenswrapper[4815]: E1207 19:32:02.928732 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" podUID="2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd" Dec 07 19:32:03 crc kubenswrapper[4815]: I1207 19:32:03.098654 4815 scope.go:117] "RemoveContainer" containerID="53e871fe6dfb5c13e672331f9acfb0b424eb8b53b9f15e0cd10ac0be0eb423df" Dec 07 19:32:03 crc kubenswrapper[4815]: I1207 19:32:03.558054 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7"] Dec 07 19:32:03 crc kubenswrapper[4815]: I1207 19:32:03.565555 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm"] Dec 07 19:32:03 crc kubenswrapper[4815]: E1207 19:32:03.601530 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" podUID="2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd" Dec 07 19:32:03 crc kubenswrapper[4815]: I1207 19:32:03.942121 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh"] Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.234603 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.627827 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" event={"ID":"34bdeed9-8374-4eea-a1d9-562d933324e9","Type":"ContainerStarted","Data":"0b75effa2d5d71861897094f381e6dde17ad7a5f745ef6bd5fbd7cebf286a7b7"} Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.630243 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" event={"ID":"097ed41c-7445-4a2f-ba72-c9ff11bb0e28","Type":"ContainerStarted","Data":"e527e8018afa86410e847bd59f69a39b65499eccd485058e5eb20c2bc067030c"} Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.648686 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" event={"ID":"33bdbbd1-62ad-42d5-a10a-a5da1344af19","Type":"ContainerStarted","Data":"26634af7b977efc5b2fd060db76a84ebc0be3cf6638959adf4042c8fc1ee0f20"} Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.662204 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" event={"ID":"01adf042-9afe-46be-ba0c-0c1a3f86ed8d","Type":"ContainerStarted","Data":"2c91e5a59433a92f96e40171bc202d046b504a947b084515a22e35f1d1ecd03f"} Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.663790 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" event={"ID":"0cc9e387-27e4-4b5d-ac7a-d9f098acb973","Type":"ContainerStarted","Data":"0eee154d84e530c5266a011db744b97f296eeb55ed0892b9f4b926c636708caf"} Dec 07 19:32:04 crc kubenswrapper[4815]: I1207 19:32:04.664518 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" 
event={"ID":"33f773e2-304c-4d0b-98e9-6fd309462297","Type":"ContainerStarted","Data":"adcca908b8d0f0b57b5d3fd6574b733688ce8f8d0963345dee3084c038897f2f"} Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.819772 4815 generic.go:334] "Generic (PLEG): container finished" podID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerID="266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f" exitCode=0 Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.819881 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerDied","Data":"266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f"} Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.824247 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" event={"ID":"33f773e2-304c-4d0b-98e9-6fd309462297","Type":"ContainerStarted","Data":"578f894ab61b8ba602f66874718dff5f5624e79e13366eaa977458e5204478e6"} Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.824866 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.829197 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" event={"ID":"8e51367a-70c8-4b67-b15f-ee4202171e38","Type":"ContainerStarted","Data":"314c8cce9e336e05088feb0445e867803e9d50f36c323bcf974e7959b7dd8bdb"} Dec 07 19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.832170 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" event={"ID":"21b81ae2-586e-418c-867d-0d10c3c094eb","Type":"ContainerStarted","Data":"b234eb4d6ede571a4d15c8587fabcd68d1add64aa020d8100db50b9478c728bc"} Dec 07 
19:32:09 crc kubenswrapper[4815]: I1207 19:32:09.888896 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" podStartSLOduration=44.888876891 podStartE2EDuration="44.888876891s" podCreationTimestamp="2025-12-07 19:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:32:09.872115829 +0000 UTC m=+1034.451105894" watchObservedRunningTime="2025-12-07 19:32:09.888876891 +0000 UTC m=+1034.467866936" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.624396 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.625211 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bdf77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-jr22f_openstack-operators(4c8e53f4-6dec-4655-b931-b8d0b8ddc8da): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.626877 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" podUID="4c8e53f4-6dec-4655-b931-b8d0b8ddc8da" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.778757 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.778939 4815 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dwhbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-psfhw_openstack-operators(cbeacff5-ac96-4444-aa57-04a320582348): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:32:11 crc kubenswrapper[4815]: E1207 19:32:11.780872 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" podUID="cbeacff5-ac96-4444-aa57-04a320582348" Dec 07 19:32:11 crc kubenswrapper[4815]: I1207 19:32:11.853785 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" event={"ID":"0cc9e387-27e4-4b5d-ac7a-d9f098acb973","Type":"ContainerStarted","Data":"b1bb4c5c06ab284c1eddec58e82f52d2b87282e11c24e1dc04d4cd8c5a69030c"} Dec 07 19:32:11 crc kubenswrapper[4815]: I1207 19:32:11.855144 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:32:11 crc kubenswrapper[4815]: I1207 19:32:11.859114 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" Dec 07 19:32:11 crc kubenswrapper[4815]: I1207 19:32:11.917446 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-bq2qd" podStartSLOduration=4.359663219 podStartE2EDuration="48.917422494s" podCreationTimestamp="2025-12-07 19:31:23 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.090296718 +0000 UTC m=+991.669286763" lastFinishedPulling="2025-12-07 19:32:11.648055983 +0000 UTC m=+1036.227046038" observedRunningTime="2025-12-07 19:32:11.906374286 +0000 UTC m=+1036.485364331" watchObservedRunningTime="2025-12-07 19:32:11.917422494 +0000 UTC m=+1036.496412539" Dec 07 19:32:13 crc kubenswrapper[4815]: E1207 19:32:13.845158 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" 
podUID="d6eb6a40-b713-4a4a-9554-749f94cf1137" Dec 07 19:32:13 crc kubenswrapper[4815]: E1207 19:32:13.881445 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" podUID="0dc4f082-f669-444f-a46e-9bca4cc20f31" Dec 07 19:32:13 crc kubenswrapper[4815]: I1207 19:32:13.886740 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerStarted","Data":"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065"} Dec 07 19:32:13 crc kubenswrapper[4815]: I1207 19:32:13.910521 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" event={"ID":"d6eb6a40-b713-4a4a-9554-749f94cf1137","Type":"ContainerStarted","Data":"d8ae8387d24ea1b9c1bd10088427a24f59880a0adba37fcf238126fc7c4bdd79"} Dec 07 19:32:13 crc kubenswrapper[4815]: I1207 19:32:13.916761 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4hccq" podStartSLOduration=6.400181945 podStartE2EDuration="43.91674548s" podCreationTimestamp="2025-12-07 19:31:30 +0000 UTC" firstStartedPulling="2025-12-07 19:31:34.213627401 +0000 UTC m=+998.792617446" lastFinishedPulling="2025-12-07 19:32:11.730190926 +0000 UTC m=+1036.309180981" observedRunningTime="2025-12-07 19:32:13.916354049 +0000 UTC m=+1038.495344094" watchObservedRunningTime="2025-12-07 19:32:13.91674548 +0000 UTC m=+1038.495735526" Dec 07 19:32:13 crc kubenswrapper[4815]: E1207 19:32:13.982074 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" podUID="2e427dde-fbfa-4b36-9749-e83080d8733a" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.083230 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" podUID="2cd3ca37-5d31-4068-88df-1344ebfad5e7" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.196839 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" podUID="bed3f9f7-d38b-4987-90fd-1c4a380165f4" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.407482 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" podUID="603612f4-25fa-4356-a99c-b054645d8919" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.763731 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" podUID="954764fa-df14-4604-96d2-6ddc12155406" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.863614 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" 
podUID="94d115be-3d8c-46e7-9a22-e09bb888afc8" Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.919545 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" event={"ID":"2e427dde-fbfa-4b36-9749-e83080d8733a","Type":"ContainerStarted","Data":"c9e710bfd7553c37e2160d3dbecc761c95c5289a7f10e86d087937e57f7f5710"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.922455 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" event={"ID":"cbeacff5-ac96-4444-aa57-04a320582348","Type":"ContainerStarted","Data":"f473c5d4b5a261ad4209981ec2f87aa4837ff965c58b43867db0e2dda965934e"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.926001 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" event={"ID":"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da","Type":"ContainerStarted","Data":"320b358bd8879299f4c1262d461d18859f7b3d9cd3ee7fdf45e8e538a7a23f85"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.934891 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" event={"ID":"2cd3ca37-5d31-4068-88df-1344ebfad5e7","Type":"ContainerStarted","Data":"6f2d277fcd9c66a2864dd6d744dba6924ef5476e41bca549c8b3bb7bf3c7c030"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.950596 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" event={"ID":"33bdbbd1-62ad-42d5-a10a-a5da1344af19","Type":"ContainerStarted","Data":"fb4e0e72d51c03c4ffaa247bb314c766ec43048d14d405d9c2ceb5657560b2d8"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.951512 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" 
Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.955329 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.964601 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" event={"ID":"bed3f9f7-d38b-4987-90fd-1c4a380165f4","Type":"ContainerStarted","Data":"a098f84d6a5fc06c5cf824380141b7d99119ea0ff805c53ec53f30db7be50feb"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.974119 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" event={"ID":"603612f4-25fa-4356-a99c-b054645d8919","Type":"ContainerStarted","Data":"ac487d2997e3acabfc86dbb70710ee21f2a217461d4fabc59147041abed691f1"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.983631 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" event={"ID":"34bdeed9-8374-4eea-a1d9-562d933324e9","Type":"ContainerStarted","Data":"52b1fc127b88d703b25985f2d912cd1b2281e9c7a8104aedecda7a8d0e42162d"} Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.985324 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:32:14 crc kubenswrapper[4815]: E1207 19:32:14.985581 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" podUID="c727b772-a57c-4564-bf8d-7c8917b4bb0d" Dec 07 19:32:14 crc kubenswrapper[4815]: I1207 19:32:14.988266 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.006830 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" event={"ID":"0dc4f082-f669-444f-a46e-9bca4cc20f31","Type":"ContainerStarted","Data":"5a61a2363f5155a7b00737fc2942b737ad018d8723033be4c123a1ee5fed3f04"} Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.024596 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" event={"ID":"097ed41c-7445-4a2f-ba72-c9ff11bb0e28","Type":"ContainerStarted","Data":"b54875ff42fa1f450d5cef8b1eefb290c05ef68128c0e0b4d72c6a9ca8c458f6"} Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.029670 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" event={"ID":"954764fa-df14-4604-96d2-6ddc12155406","Type":"ContainerStarted","Data":"34a13b0802bd74d2207e9b29052c291536be87220645b1c3a3432fd7d3b5cea7"} Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.037038 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-spddl" podStartSLOduration=5.160183921 podStartE2EDuration="51.037025815s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.759826234 +0000 UTC m=+992.338816279" lastFinishedPulling="2025-12-07 19:32:13.636668108 +0000 UTC m=+1038.215658173" observedRunningTime="2025-12-07 19:32:15.03369813 +0000 UTC m=+1039.612688165" watchObservedRunningTime="2025-12-07 19:32:15.037025815 +0000 UTC m=+1039.616015860" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.042241 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" 
event={"ID":"01adf042-9afe-46be-ba0c-0c1a3f86ed8d","Type":"ContainerStarted","Data":"67362f0999ecf6345a9faf2694a67359d438efbb0113d55aff274ca6b3cf09fb"} Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.058607 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" event={"ID":"21b81ae2-586e-418c-867d-0d10c3c094eb","Type":"ContainerStarted","Data":"f9773928ea2ebc04e92443145bc398c4adc4a1b3a94f413485f88c4852bba843"} Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.059806 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.063511 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.064237 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" event={"ID":"94d115be-3d8c-46e7-9a22-e09bb888afc8","Type":"ContainerStarted","Data":"5b96c3e37d57479efccd10f901ac6714b40403618722133cc39b6a2e41a0342a"} Dec 07 19:32:15 crc kubenswrapper[4815]: E1207 19:32:15.071450 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" podUID="94d115be-3d8c-46e7-9a22-e09bb888afc8" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.122102 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dtpkv" podStartSLOduration=5.724893278 
podStartE2EDuration="51.122085492s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.228296775 +0000 UTC m=+992.807286820" lastFinishedPulling="2025-12-07 19:32:13.625488989 +0000 UTC m=+1038.204479034" observedRunningTime="2025-12-07 19:32:15.121386162 +0000 UTC m=+1039.700376207" watchObservedRunningTime="2025-12-07 19:32:15.122085492 +0000 UTC m=+1039.701075537" Dec 07 19:32:15 crc kubenswrapper[4815]: I1207 19:32:15.228519 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-kdbct" podStartSLOduration=4.97698415 podStartE2EDuration="50.22850516s" podCreationTimestamp="2025-12-07 19:31:25 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.384722586 +0000 UTC m=+992.963712631" lastFinishedPulling="2025-12-07 19:32:13.636243566 +0000 UTC m=+1038.215233641" observedRunningTime="2025-12-07 19:32:15.224996019 +0000 UTC m=+1039.803986064" watchObservedRunningTime="2025-12-07 19:32:15.22850516 +0000 UTC m=+1039.807495205" Dec 07 19:32:15 crc kubenswrapper[4815]: E1207 19:32:15.427866 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" podUID="0404b56e-87cb-40c5-b11c-64dc7c960718" Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.070445 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" event={"ID":"c727b772-a57c-4564-bf8d-7c8917b4bb0d","Type":"ContainerStarted","Data":"3bb995d8afebd66cb205ecb273a6978ecf3683cf1854bbb254968b35209ff5ec"} Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.073163 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" event={"ID":"8055e37c-efd4-4c82-a9df-4d5e2a12ef63","Type":"ContainerStarted","Data":"095e9477afb813109e8c17e65d1b2bb3015d3368d86761ccd98c1ad56e8fa52e"} Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.074670 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" event={"ID":"4c8e53f4-6dec-4655-b931-b8d0b8ddc8da","Type":"ContainerStarted","Data":"f76a282a08f18c02e74e4a48c4812d7981a1c043e329b4ef6506e889d5196ba6"} Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.074845 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.076024 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" event={"ID":"0404b56e-87cb-40c5-b11c-64dc7c960718","Type":"ContainerStarted","Data":"700d8d352f1463a23259156ee8ebd707d78850c78940ebfeb218434af653bd89"} Dec 07 19:32:16 crc kubenswrapper[4815]: I1207 19:32:16.116036 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" podStartSLOduration=6.107946962 podStartE2EDuration="53.11602072s" podCreationTimestamp="2025-12-07 19:31:23 +0000 UTC" firstStartedPulling="2025-12-07 19:31:26.61880429 +0000 UTC m=+991.197794335" lastFinishedPulling="2025-12-07 19:32:13.626878048 +0000 UTC m=+1038.205868093" observedRunningTime="2025-12-07 19:32:16.115249328 +0000 UTC m=+1040.694239383" watchObservedRunningTime="2025-12-07 19:32:16.11602072 +0000 UTC m=+1040.695010765" Dec 07 19:32:16 crc kubenswrapper[4815]: E1207 19:32:16.493868 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" podUID="8055e37c-efd4-4c82-a9df-4d5e2a12ef63" Dec 07 19:32:16 crc kubenswrapper[4815]: E1207 19:32:16.613706 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" podUID="b7efe346-4e91-45a1-84d6-6a5aac1c739c" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.101100 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" event={"ID":"b7efe346-4e91-45a1-84d6-6a5aac1c739c","Type":"ContainerStarted","Data":"23d15d4efc0a92d0085a2adf0eb418bb264201f691e356329bf5d482065c0d9d"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.147138 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" event={"ID":"0dc4f082-f669-444f-a46e-9bca4cc20f31","Type":"ContainerStarted","Data":"48bf6b2d87bf3582287dff9ef45b4ec0f0d3de1c946d26f2155a24a202e64392"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.147432 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.178612 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" event={"ID":"097ed41c-7445-4a2f-ba72-c9ff11bb0e28","Type":"ContainerStarted","Data":"014c724d02af5fe2474ed4b30c831d061f9ae7789972a32043d4d3af6fd66aa6"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.178741 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.188223 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" event={"ID":"01adf042-9afe-46be-ba0c-0c1a3f86ed8d","Type":"ContainerStarted","Data":"c29f22ae0daaf4003f1f5002ed54e561212ffc04aef258511fd328ff8ffd93e6"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.188761 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.197078 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" event={"ID":"8e51367a-70c8-4b67-b15f-ee4202171e38","Type":"ContainerStarted","Data":"a7796a7718609f9ad7cea8ee94a61d4cbc580abb02d2302d47743a99f6cdfc9c"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.198755 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.211674 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.213733 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" event={"ID":"d6eb6a40-b713-4a4a-9554-749f94cf1137","Type":"ContainerStarted","Data":"54378c2c9affadd07cfd967bf16d394c2831ec031b1c09b1fd3da5a16e7750e0"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.214139 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:32:17 
crc kubenswrapper[4815]: I1207 19:32:17.234528 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" podStartSLOduration=6.815850428 podStartE2EDuration="53.234511774s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.020169195 +0000 UTC m=+992.599159240" lastFinishedPulling="2025-12-07 19:32:14.438830541 +0000 UTC m=+1039.017820586" observedRunningTime="2025-12-07 19:32:17.187404319 +0000 UTC m=+1041.766394364" watchObservedRunningTime="2025-12-07 19:32:17.234511774 +0000 UTC m=+1041.813501819" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.243365 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" event={"ID":"cbeacff5-ac96-4444-aa57-04a320582348","Type":"ContainerStarted","Data":"2cc7aa657b35e74342111f53f1aa979b3a526d9cd3106e2d2f59e2609dc67300"} Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.244097 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.247104 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" podStartSLOduration=43.906642169 podStartE2EDuration="53.247081942s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:32:04.250562011 +0000 UTC m=+1028.829552056" lastFinishedPulling="2025-12-07 19:32:13.591001734 +0000 UTC m=+1038.169991829" observedRunningTime="2025-12-07 19:32:17.233490194 +0000 UTC m=+1041.812480239" watchObservedRunningTime="2025-12-07 19:32:17.247081942 +0000 UTC m=+1041.826071987" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.266558 4815 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" podStartSLOduration=43.96952071 podStartE2EDuration="53.266540898s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:32:04.234398506 +0000 UTC m=+1028.813388551" lastFinishedPulling="2025-12-07 19:32:13.531418654 +0000 UTC m=+1038.110408739" observedRunningTime="2025-12-07 19:32:17.260287569 +0000 UTC m=+1041.839277634" watchObservedRunningTime="2025-12-07 19:32:17.266540898 +0000 UTC m=+1041.845530933" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.282934 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" podStartSLOduration=6.005543832 podStartE2EDuration="53.282903395s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.213429571 +0000 UTC m=+991.792419616" lastFinishedPulling="2025-12-07 19:32:14.490789134 +0000 UTC m=+1039.069779179" observedRunningTime="2025-12-07 19:32:17.282210025 +0000 UTC m=+1041.861200070" watchObservedRunningTime="2025-12-07 19:32:17.282903395 +0000 UTC m=+1041.861893440" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.378799 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" podStartSLOduration=7.840679257 podStartE2EDuration="54.378784621s" podCreationTimestamp="2025-12-07 19:31:23 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.089264728 +0000 UTC m=+991.668254773" lastFinishedPulling="2025-12-07 19:32:13.627370102 +0000 UTC m=+1038.206360137" observedRunningTime="2025-12-07 19:32:17.364825123 +0000 UTC m=+1041.943815168" watchObservedRunningTime="2025-12-07 19:32:17.378784621 +0000 UTC m=+1041.957774666" Dec 07 19:32:17 crc kubenswrapper[4815]: I1207 19:32:17.383494 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-8spbl" podStartSLOduration=7.1575273 podStartE2EDuration="52.383476784s" podCreationTimestamp="2025-12-07 19:31:25 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.410615331 +0000 UTC m=+992.989605376" lastFinishedPulling="2025-12-07 19:32:13.636564775 +0000 UTC m=+1038.215554860" observedRunningTime="2025-12-07 19:32:17.326210071 +0000 UTC m=+1041.905200116" watchObservedRunningTime="2025-12-07 19:32:17.383476784 +0000 UTC m=+1041.962466829" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.258718 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" event={"ID":"954764fa-df14-4604-96d2-6ddc12155406","Type":"ContainerStarted","Data":"2c513638ef975ac4ae2f56543166bb524d3520552b89bb7765ed38e8478d738d"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.259441 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.260507 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" event={"ID":"0404b56e-87cb-40c5-b11c-64dc7c960718","Type":"ContainerStarted","Data":"64d6eb4a388d164404891d5c31522cb151ea4ac7a702ad24b490d5a08a5fde32"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.260660 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.261894 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" event={"ID":"bed3f9f7-d38b-4987-90fd-1c4a380165f4","Type":"ContainerStarted","Data":"208bfc27060e4e9003c2d60447bc28dd17622e82c2b9adc1c387b3d935987542"} Dec 07 
19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.262198 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.263230 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" event={"ID":"8055e37c-efd4-4c82-a9df-4d5e2a12ef63","Type":"ContainerStarted","Data":"26f79126684d49e30921a8e4b3594ff8a2a38f47a1d0d54f477e050c41942168"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.263436 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.264694 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" event={"ID":"2e427dde-fbfa-4b36-9749-e83080d8733a","Type":"ContainerStarted","Data":"67fdb691208c9eb003b1b93f7f43de9aa5a3a0090f9fee093b4fee365be5749e"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.265860 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" event={"ID":"603612f4-25fa-4356-a99c-b054645d8919","Type":"ContainerStarted","Data":"16eac63e820ec327271de24117342d2024204908e711fdb1826f10f79222a67a"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.265970 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.268396 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" 
event={"ID":"2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd","Type":"ContainerStarted","Data":"7ed70ad65e038599974154a0bc65b6952171cc0f4f4e84a443de58188683cbe7"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.270467 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" event={"ID":"b7efe346-4e91-45a1-84d6-6a5aac1c739c","Type":"ContainerStarted","Data":"aec2a45edc873c1c59960e8daffc82c22bd3327317d050b5f5baab23adce8b56"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.270524 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.275196 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" event={"ID":"c727b772-a57c-4564-bf8d-7c8917b4bb0d","Type":"ContainerStarted","Data":"d4e13493d8d8dd2fd297af8a0671198d31d3f789a59cf5d471773832812ad36f"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.275316 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.277721 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" event={"ID":"2cd3ca37-5d31-4068-88df-1344ebfad5e7","Type":"ContainerStarted","Data":"e48a4009076b76a24a1d5a697319a1b51492a37a510427a417bc5ea9505af65d"} Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.278325 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.283994 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-84b575879fvvqrm" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.289260 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" podStartSLOduration=5.288136555 podStartE2EDuration="54.289245846s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.898983708 +0000 UTC m=+992.477973753" lastFinishedPulling="2025-12-07 19:32:16.900092999 +0000 UTC m=+1041.479083044" observedRunningTime="2025-12-07 19:32:18.286144908 +0000 UTC m=+1042.865134953" watchObservedRunningTime="2025-12-07 19:32:18.289245846 +0000 UTC m=+1042.868235891" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.317941 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" podStartSLOduration=5.388451357 podStartE2EDuration="54.317923795s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.932321588 +0000 UTC m=+992.511311633" lastFinishedPulling="2025-12-07 19:32:16.861794026 +0000 UTC m=+1041.440784071" observedRunningTime="2025-12-07 19:32:18.317271756 +0000 UTC m=+1042.896261801" watchObservedRunningTime="2025-12-07 19:32:18.317923795 +0000 UTC m=+1042.896913840" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.393738 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" podStartSLOduration=4.734966526 podStartE2EDuration="54.393719448s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.931874115 +0000 UTC m=+992.510864160" lastFinishedPulling="2025-12-07 19:32:17.590627027 +0000 UTC m=+1042.169617082" observedRunningTime="2025-12-07 19:32:18.387868521 +0000 UTC m=+1042.966858566" 
watchObservedRunningTime="2025-12-07 19:32:18.393719448 +0000 UTC m=+1042.972709493" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.446727 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xx86c" podStartSLOduration=5.421090253 podStartE2EDuration="52.44671277s" podCreationTimestamp="2025-12-07 19:31:26 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.372435432 +0000 UTC m=+992.951425477" lastFinishedPulling="2025-12-07 19:32:15.398057949 +0000 UTC m=+1039.977047994" observedRunningTime="2025-12-07 19:32:18.443140668 +0000 UTC m=+1043.022130713" watchObservedRunningTime="2025-12-07 19:32:18.44671277 +0000 UTC m=+1043.025702815" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.501301 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" podStartSLOduration=5.870866077 podStartE2EDuration="54.501277878s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.237436288 +0000 UTC m=+992.816426333" lastFinishedPulling="2025-12-07 19:32:16.867848089 +0000 UTC m=+1041.446838134" observedRunningTime="2025-12-07 19:32:18.495014439 +0000 UTC m=+1043.074004484" watchObservedRunningTime="2025-12-07 19:32:18.501277878 +0000 UTC m=+1043.080267923" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.522742 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dc7d5d6ff-dnrvh" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.537368 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" podStartSLOduration=4.963962087 podStartE2EDuration="53.537353367s" podCreationTimestamp="2025-12-07 19:31:25 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.292558674 
+0000 UTC m=+992.871548709" lastFinishedPulling="2025-12-07 19:32:16.865949944 +0000 UTC m=+1041.444939989" observedRunningTime="2025-12-07 19:32:18.536197995 +0000 UTC m=+1043.115188040" watchObservedRunningTime="2025-12-07 19:32:18.537353367 +0000 UTC m=+1043.116343412" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.581232 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" podStartSLOduration=5.462248199 podStartE2EDuration="54.581216749s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.256548948 +0000 UTC m=+992.835538993" lastFinishedPulling="2025-12-07 19:32:17.375517498 +0000 UTC m=+1041.954507543" observedRunningTime="2025-12-07 19:32:18.579774838 +0000 UTC m=+1043.158764883" watchObservedRunningTime="2025-12-07 19:32:18.581216749 +0000 UTC m=+1043.160206794" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.677427 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" podStartSLOduration=6.018427784 podStartE2EDuration="55.677403965s" podCreationTimestamp="2025-12-07 19:31:23 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.239699237 +0000 UTC m=+991.818689282" lastFinishedPulling="2025-12-07 19:32:16.898675418 +0000 UTC m=+1041.477665463" observedRunningTime="2025-12-07 19:32:18.675962884 +0000 UTC m=+1043.254952929" watchObservedRunningTime="2025-12-07 19:32:18.677403965 +0000 UTC m=+1043.256394030" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.718332 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" podStartSLOduration=5.256147732 podStartE2EDuration="54.718315422s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.899148793 +0000 UTC 
m=+992.478138838" lastFinishedPulling="2025-12-07 19:32:17.361316483 +0000 UTC m=+1041.940306528" observedRunningTime="2025-12-07 19:32:18.711510628 +0000 UTC m=+1043.290500673" watchObservedRunningTime="2025-12-07 19:32:18.718315422 +0000 UTC m=+1043.297305467" Dec 07 19:32:18 crc kubenswrapper[4815]: I1207 19:32:18.766517 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" podStartSLOduration=4.922461038 podStartE2EDuration="54.766500568s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:27.973864873 +0000 UTC m=+992.552854918" lastFinishedPulling="2025-12-07 19:32:17.817904403 +0000 UTC m=+1042.396894448" observedRunningTime="2025-12-07 19:32:18.763031119 +0000 UTC m=+1043.342021174" watchObservedRunningTime="2025-12-07 19:32:18.766500568 +0000 UTC m=+1043.345490613" Dec 07 19:32:19 crc kubenswrapper[4815]: I1207 19:32:19.287668 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:32:19 crc kubenswrapper[4815]: I1207 19:32:19.288693 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-psfhw" Dec 07 19:32:20 crc kubenswrapper[4815]: I1207 19:32:20.751709 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5rmk7" Dec 07 19:32:21 crc kubenswrapper[4815]: I1207 19:32:21.267304 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:21 crc kubenswrapper[4815]: I1207 19:32:21.267359 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:21 crc kubenswrapper[4815]: I1207 
19:32:21.308503 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:21 crc kubenswrapper[4815]: I1207 19:32:21.360975 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:21 crc kubenswrapper[4815]: I1207 19:32:21.547603 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.315949 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4hccq" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="registry-server" containerID="cri-o://765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065" gracePeriod=2 Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.740813 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.894468 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content\") pod \"9b60f96b-6202-45f2-b805-dfa1dbd47004\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.894511 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities\") pod \"9b60f96b-6202-45f2-b805-dfa1dbd47004\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.894604 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcf62\" (UniqueName: 
\"kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62\") pod \"9b60f96b-6202-45f2-b805-dfa1dbd47004\" (UID: \"9b60f96b-6202-45f2-b805-dfa1dbd47004\") " Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.895976 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities" (OuterVolumeSpecName: "utilities") pod "9b60f96b-6202-45f2-b805-dfa1dbd47004" (UID: "9b60f96b-6202-45f2-b805-dfa1dbd47004"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.903000 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62" (OuterVolumeSpecName: "kube-api-access-dcf62") pod "9b60f96b-6202-45f2-b805-dfa1dbd47004" (UID: "9b60f96b-6202-45f2-b805-dfa1dbd47004"). InnerVolumeSpecName "kube-api-access-dcf62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.920124 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b60f96b-6202-45f2-b805-dfa1dbd47004" (UID: "9b60f96b-6202-45f2-b805-dfa1dbd47004"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.996123 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcf62\" (UniqueName: \"kubernetes.io/projected/9b60f96b-6202-45f2-b805-dfa1dbd47004-kube-api-access-dcf62\") on node \"crc\" DevicePath \"\"" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.996163 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:32:23 crc kubenswrapper[4815]: I1207 19:32:23.996180 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b60f96b-6202-45f2-b805-dfa1dbd47004-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.328142 4815 generic.go:334] "Generic (PLEG): container finished" podID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerID="765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065" exitCode=0 Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.328190 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerDied","Data":"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065"} Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.328220 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4hccq" event={"ID":"9b60f96b-6202-45f2-b805-dfa1dbd47004","Type":"ContainerDied","Data":"7bf61a096267b25a66c6d2b9d610acd25418af867890e77b4ccb77264d985222"} Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.328240 4815 scope.go:117] "RemoveContainer" containerID="765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 
19:32:24.328262 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4hccq" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.350444 4815 scope.go:117] "RemoveContainer" containerID="266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.382511 4815 scope.go:117] "RemoveContainer" containerID="f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.403660 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.423644 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4hccq"] Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.429535 4815 scope.go:117] "RemoveContainer" containerID="765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065" Dec 07 19:32:24 crc kubenswrapper[4815]: E1207 19:32:24.433947 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065\": container with ID starting with 765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065 not found: ID does not exist" containerID="765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.434085 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065"} err="failed to get container status \"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065\": rpc error: code = NotFound desc = could not find container \"765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065\": container with ID starting with 
765b231ea00ae84118f9254b6c77d9dcd83bc14cd027c91dbbbdaced0eb1f065 not found: ID does not exist" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.434173 4815 scope.go:117] "RemoveContainer" containerID="266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f" Dec 07 19:32:24 crc kubenswrapper[4815]: E1207 19:32:24.434811 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f\": container with ID starting with 266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f not found: ID does not exist" containerID="266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.434907 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f"} err="failed to get container status \"266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f\": rpc error: code = NotFound desc = could not find container \"266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f\": container with ID starting with 266f3b0fd592a4440604d1fbb020bcf83107356108c140b300bfc8166ab1753f not found: ID does not exist" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.434992 4815 scope.go:117] "RemoveContainer" containerID="f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46" Dec 07 19:32:24 crc kubenswrapper[4815]: E1207 19:32:24.435420 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46\": container with ID starting with f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46 not found: ID does not exist" containerID="f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46" Dec 07 19:32:24 crc 
kubenswrapper[4815]: I1207 19:32:24.435497 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46"} err="failed to get container status \"f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46\": rpc error: code = NotFound desc = could not find container \"f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46\": container with ID starting with f2be0ccc605fbb4bfcc86b4a56886049a15eb05b6b29894fb43cca9a6613fe46 not found: ID does not exist" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.563383 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-jr22f" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.712783 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-7wwz9" Dec 07 19:32:24 crc kubenswrapper[4815]: I1207 19:32:24.807367 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-trnlm" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.331980 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-z7p7t" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.370659 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-x59hf" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.382389 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-5shnd" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.439761 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-7d9wv" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.653965 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-4wf77" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.738522 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-m6p4v" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.777971 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" path="/var/lib/kubelet/pods/9b60f96b-6202-45f2-b805-dfa1dbd47004/volumes" Dec 07 19:32:25 crc kubenswrapper[4815]: I1207 19:32:25.831931 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-8mg5j" Dec 07 19:32:26 crc kubenswrapper[4815]: I1207 19:32:26.027163 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-jsdgf" Dec 07 19:32:26 crc kubenswrapper[4815]: I1207 19:32:26.247279 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gfvgj" Dec 07 19:32:30 crc kubenswrapper[4815]: I1207 19:32:30.475937 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" event={"ID":"94d115be-3d8c-46e7-9a22-e09bb888afc8","Type":"ContainerStarted","Data":"69c967657a4b72007974ecaa8eeb673baf39236816ca44ae221bf0e1eab7f463"} Dec 07 19:32:30 crc kubenswrapper[4815]: I1207 19:32:30.476759 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:32:30 crc 
kubenswrapper[4815]: I1207 19:32:30.498062 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" podStartSLOduration=4.791221125 podStartE2EDuration="1m6.498037248s" podCreationTimestamp="2025-12-07 19:31:24 +0000 UTC" firstStartedPulling="2025-12-07 19:31:28.457455149 +0000 UTC m=+993.036445184" lastFinishedPulling="2025-12-07 19:32:30.164271262 +0000 UTC m=+1054.743261307" observedRunningTime="2025-12-07 19:32:30.494484856 +0000 UTC m=+1055.073474911" watchObservedRunningTime="2025-12-07 19:32:30.498037248 +0000 UTC m=+1055.077027293" Dec 07 19:32:36 crc kubenswrapper[4815]: I1207 19:32:36.036802 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-4sb78" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.223193 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224039 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="extract-utilities" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224057 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="extract-utilities" Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224091 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="extract-content" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224100 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="extract-content" Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224120 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" 
containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224128 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224139 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224145 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224171 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="extract-utilities" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224179 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="extract-utilities" Dec 07 19:32:51 crc kubenswrapper[4815]: E1207 19:32:51.224195 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="extract-content" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224202 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="extract-content" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224377 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b1c55a7-0911-4926-92da-1710e03daba2" containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.224401 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b60f96b-6202-45f2-b805-dfa1dbd47004" containerName="registry-server" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.225273 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.227651 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.227651 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.233209 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.240422 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-75dnx" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.251474 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.260851 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsb4x\" (UniqueName: \"kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.260906 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.327146 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.328403 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.332952 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.362065 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jnsn\" (UniqueName: \"kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.362125 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.362165 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsb4x\" (UniqueName: \"kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.362307 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.362347 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.363128 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.397162 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.408518 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsb4x\" (UniqueName: \"kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x\") pod \"dnsmasq-dns-675f4bcbfc-zdbbg\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.463250 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.463300 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jnsn\" (UniqueName: \"kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.463348 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.464163 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.464195 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.485780 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jnsn\" (UniqueName: \"kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn\") pod \"dnsmasq-dns-78dd6ddcc-ts9zd\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.540358 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:32:51 crc kubenswrapper[4815]: I1207 19:32:51.645616 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:32:52 crc kubenswrapper[4815]: I1207 19:32:52.049229 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:32:52 crc kubenswrapper[4815]: I1207 19:32:52.108172 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:32:52 crc kubenswrapper[4815]: I1207 19:32:52.673425 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" event={"ID":"20f3d558-f5be-4f87-84c2-627a3e9dadde","Type":"ContainerStarted","Data":"4e863b62718c51f22196f68c80b467d91d6dc1fb4a21ce5053074a0028498423"} Dec 07 19:32:52 crc kubenswrapper[4815]: I1207 19:32:52.676742 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" event={"ID":"75e23e74-8711-40ca-82e9-ff6b9854a60e","Type":"ContainerStarted","Data":"1904d44a412c2b79e197f3b5802cfd452d4e81e866a465cb268c16b588ae6366"} Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.769753 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.799214 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.800866 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.813560 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.887998 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.889129 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.889210 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gckx\" (UniqueName: \"kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.991974 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.992046 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gckx\" (UniqueName: 
\"kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.992108 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.993691 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:54 crc kubenswrapper[4815]: I1207 19:32:54.994431 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.055328 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gckx\" (UniqueName: \"kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx\") pod \"dnsmasq-dns-666b6646f7-2wpl6\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.122176 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.129402 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.165965 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.169206 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.186590 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.296195 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dgzw\" (UniqueName: \"kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.296275 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.296329 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.397596 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dgzw\" (UniqueName: 
\"kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.397649 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.397680 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.399102 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.400278 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.439131 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dgzw\" (UniqueName: \"kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw\") pod \"dnsmasq-dns-57d769cc4f-2gxx9\" 
(UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.570059 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:32:55 crc kubenswrapper[4815]: I1207 19:32:55.799999 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:32:55 crc kubenswrapper[4815]: W1207 19:32:55.831223 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5847a638_115d_4f64_afe1_2935c623a28e.slice/crio-e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c WatchSource:0}: Error finding container e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c: Status 404 returned error can't find the container with id e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.069763 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.071883 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.117356 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ccddp" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.119804 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.120145 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.120455 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.122319 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.122493 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.128420 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.140165 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220723 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220775 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220798 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh75p\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220831 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220863 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220880 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220893 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220908 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220939 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220957 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.220983 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.301436 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.302708 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.312632 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313064 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313196 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313348 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mpdsj" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313534 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313687 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.313811 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322127 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322165 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 
19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322186 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322202 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322223 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322243 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322269 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322306 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322330 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322349 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh75p\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.322379 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.323788 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.324964 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.326828 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.328102 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.329373 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.329853 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.333954 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.331824 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.339596 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.360391 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.365786 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.371302 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.371771 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh75p\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.371969 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " 
pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.423894 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424014 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424031 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75xhf\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424056 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424075 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424111 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424172 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424211 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424239 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424271 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.424330 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.453007 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525765 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525826 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525866 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525888 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525959 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.525998 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526034 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526055 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75xhf\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526078 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526096 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526117 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.526584 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.527173 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.527450 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.527650 
4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.531715 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.534439 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.535731 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.537432 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.539476 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.541241 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.548784 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75xhf\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.562404 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.646850 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.715208 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerStarted","Data":"f8d3eb4163f8c7d7e11ec5e82d5b1b582ddeab0a08eb1502232d7ba79566396a"} Dec 07 19:32:56 crc kubenswrapper[4815]: I1207 19:32:56.720781 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" event={"ID":"5847a638-115d-4f64-afe1-2935c623a28e","Type":"ContainerStarted","Data":"e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c"} Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.437460 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.439379 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.442177 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.443439 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-kxfzk" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.443761 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.452440 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.462345 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.469504 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.548865 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-default\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549288 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549373 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549468 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549546 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 
crc kubenswrapper[4815]: I1207 19:32:57.549642 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-kolla-config\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549709 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.549777 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h2cg\" (UniqueName: \"kubernetes.io/projected/b428ca75-b6e8-428e-be32-eb320bacbdda-kube-api-access-6h2cg\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651094 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651136 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651174 4815 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-kolla-config\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651192 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651213 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h2cg\" (UniqueName: \"kubernetes.io/projected/b428ca75-b6e8-428e-be32-eb320bacbdda-kube-api-access-6h2cg\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651255 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-default\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651280 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.651296 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.653179 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-default\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.653685 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.653709 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.653828 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b428ca75-b6e8-428e-be32-eb320bacbdda-kolla-config\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.653989 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b428ca75-b6e8-428e-be32-eb320bacbdda-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.672644 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.675386 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b428ca75-b6e8-428e-be32-eb320bacbdda-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.676792 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h2cg\" (UniqueName: \"kubernetes.io/projected/b428ca75-b6e8-428e-be32-eb320bacbdda-kube-api-access-6h2cg\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.677733 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"b428ca75-b6e8-428e-be32-eb320bacbdda\") " pod="openstack/openstack-galera-0" Dec 07 19:32:57 crc kubenswrapper[4815]: I1207 19:32:57.806112 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.654005 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.655315 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.657189 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.657368 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4sskn" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.661462 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.661664 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.661906 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802557 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802596 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802638 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802661 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802689 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clk69\" (UniqueName: \"kubernetes.io/projected/1dbe253c-608d-4711-904b-44926572c998-kube-api-access-clk69\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802710 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802735 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.802766 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/1dbe253c-608d-4711-904b-44926572c998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.904861 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.904908 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.904989 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905022 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905061 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clk69\" (UniqueName: 
\"kubernetes.io/projected/1dbe253c-608d-4711-904b-44926572c998-kube-api-access-clk69\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905090 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905121 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905172 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1dbe253c-608d-4711-904b-44926572c998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.905495 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.907530 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/1dbe253c-608d-4711-904b-44926572c998-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.910582 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.910688 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.914230 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.915531 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dbe253c-608d-4711-904b-44926572c998-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.916611 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1dbe253c-608d-4711-904b-44926572c998-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.927609 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clk69\" (UniqueName: \"kubernetes.io/projected/1dbe253c-608d-4711-904b-44926572c998-kube-api-access-clk69\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.944793 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"1dbe253c-608d-4711-904b-44926572c998\") " pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:58 crc kubenswrapper[4815]: I1207 19:32:58.985019 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.184015 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.185347 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.192105 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gnp7r" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.193383 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.195708 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.239956 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.240016 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-config-data\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.240038 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-kolla-config\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.240057 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9g2\" (UniqueName: \"kubernetes.io/projected/77640a82-dd82-4fe6-89f8-58d8ab094b71-kube-api-access-2l9g2\") pod \"memcached-0\" (UID: 
\"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.240082 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.252366 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.341352 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.341407 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-config-data\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.341429 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-kolla-config\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.341450 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9g2\" (UniqueName: \"kubernetes.io/projected/77640a82-dd82-4fe6-89f8-58d8ab094b71-kube-api-access-2l9g2\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") 
" pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.341475 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.343792 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-config-data\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.344012 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/77640a82-dd82-4fe6-89f8-58d8ab094b71-kolla-config\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.347438 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.347592 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77640a82-dd82-4fe6-89f8-58d8ab094b71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.360783 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9g2\" (UniqueName: 
\"kubernetes.io/projected/77640a82-dd82-4fe6-89f8-58d8ab094b71-kube-api-access-2l9g2\") pod \"memcached-0\" (UID: \"77640a82-dd82-4fe6-89f8-58d8ab094b71\") " pod="openstack/memcached-0" Dec 07 19:32:59 crc kubenswrapper[4815]: I1207 19:32:59.499143 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 07 19:33:00 crc kubenswrapper[4815]: I1207 19:33:00.921531 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:33:00 crc kubenswrapper[4815]: I1207 19:33:00.922674 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:33:00 crc kubenswrapper[4815]: I1207 19:33:00.928385 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cblbj" Dec 07 19:33:00 crc kubenswrapper[4815]: I1207 19:33:00.929438 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:33:01 crc kubenswrapper[4815]: I1207 19:33:01.100867 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qvf\" (UniqueName: \"kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf\") pod \"kube-state-metrics-0\" (UID: \"8a52cbc1-9245-48e5-8b22-0cdf96dc671b\") " pod="openstack/kube-state-metrics-0" Dec 07 19:33:01 crc kubenswrapper[4815]: I1207 19:33:01.202557 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qvf\" (UniqueName: \"kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf\") pod \"kube-state-metrics-0\" (UID: \"8a52cbc1-9245-48e5-8b22-0cdf96dc671b\") " pod="openstack/kube-state-metrics-0" Dec 07 19:33:01 crc kubenswrapper[4815]: I1207 19:33:01.232723 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qvf\" (UniqueName: 
\"kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf\") pod \"kube-state-metrics-0\" (UID: \"8a52cbc1-9245-48e5-8b22-0cdf96dc671b\") " pod="openstack/kube-state-metrics-0" Dec 07 19:33:01 crc kubenswrapper[4815]: I1207 19:33:01.250853 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:33:05 crc kubenswrapper[4815]: I1207 19:33:05.898528 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.173259 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.174503 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.177030 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.179903 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.179933 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.180090 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pmlnt" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.182299 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.195455 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.317825 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318204 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318234 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318277 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318330 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-config\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318357 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318375 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m568d\" (UniqueName: \"kubernetes.io/projected/4ebfeb2d-526f-4f02-be1d-def3f49c555c-kube-api-access-m568d\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.318407 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419251 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419312 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419365 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419388 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419410 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m568d\" (UniqueName: \"kubernetes.io/projected/4ebfeb2d-526f-4f02-be1d-def3f49c555c-kube-api-access-m568d\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419446 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419470 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.419489 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.425373 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.426472 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.426778 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4ebfeb2d-526f-4f02-be1d-def3f49c555c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.426983 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.427835 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ebfeb2d-526f-4f02-be1d-def3f49c555c-config\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.428735 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.452553 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.459708 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m568d\" (UniqueName: \"kubernetes.io/projected/4ebfeb2d-526f-4f02-be1d-def3f49c555c-kube-api-access-m568d\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.464234 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-24gkx"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.469005 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ebfeb2d-526f-4f02-be1d-def3f49c555c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4ebfeb2d-526f-4f02-be1d-def3f49c555c\") " pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.469742 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.472315 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.473835 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.474263 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vw4gg" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.474643 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bhvc5"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.476970 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.483875 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-24gkx"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.497628 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.499885 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bhvc5"] Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622654 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-scripts\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622707 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wg4p\" (UniqueName: \"kubernetes.io/projected/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-kube-api-access-5wg4p\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622754 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622774 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-etc-ovs\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622811 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-log\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622879 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-lib\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622960 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-scripts\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.622990 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-combined-ca-bundle\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.623012 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-ovn-controller-tls-certs\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.623033 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c4w9\" (UniqueName: 
\"kubernetes.io/projected/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-kube-api-access-8c4w9\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.623131 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-run\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.623216 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.623241 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-log-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725095 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725136 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-etc-ovs\") pod 
\"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725167 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-log\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725271 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-lib\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725304 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-scripts\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725605 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-combined-ca-bundle\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725675 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-etc-ovs\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: 
I1207 19:33:06.725739 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-ovn-controller-tls-certs\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725776 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c4w9\" (UniqueName: \"kubernetes.io/projected/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-kube-api-access-8c4w9\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725791 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-lib\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.725977 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-log\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726078 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-run\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726441 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726665 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-log-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726685 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-scripts\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726703 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726574 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-var-run\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726618 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-run\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 
19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.726932 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-var-log-ovn\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.727017 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wg4p\" (UniqueName: \"kubernetes.io/projected/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-kube-api-access-5wg4p\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.729330 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-ovn-controller-tls-certs\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.730258 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-scripts\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.740540 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c4w9\" (UniqueName: \"kubernetes.io/projected/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-kube-api-access-8c4w9\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.740550 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-combined-ca-bundle\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.747428 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wg4p\" (UniqueName: \"kubernetes.io/projected/5d3ed0f7-1ea2-48e7-bab4-f5a709da4850-kube-api-access-5wg4p\") pod \"ovn-controller-24gkx\" (UID: \"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850\") " pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.748216 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21-scripts\") pod \"ovn-controller-ovs-bhvc5\" (UID: \"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21\") " pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.846455 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-24gkx" Dec 07 19:33:06 crc kubenswrapper[4815]: I1207 19:33:06.859616 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.361886 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.366647 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.369691 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6ldvl" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.375804 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.376115 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.376786 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.383206 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.480816 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.480870 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-config\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.480925 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.481180 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.481255 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.481298 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.481347 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.481375 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbzbk\" (UniqueName: \"kubernetes.io/projected/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-kube-api-access-dbzbk\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 
crc kubenswrapper[4815]: I1207 19:33:07.583335 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583416 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-config\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583444 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583504 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583530 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583549 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583572 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.583587 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbzbk\" (UniqueName: \"kubernetes.io/projected/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-kube-api-access-dbzbk\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.584444 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.584740 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.585114 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-config\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc 
kubenswrapper[4815]: I1207 19:33:07.586204 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.591747 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.593818 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.601858 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.604438 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbzbk\" (UniqueName: \"kubernetes.io/projected/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-kube-api-access-dbzbk\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.611889 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/17df86ce-9c36-46d9-b5f2-d5dae8ae4675-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"17df86ce-9c36-46d9-b5f2-d5dae8ae4675\") " pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:07 crc kubenswrapper[4815]: I1207 19:33:07.704600 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:12 crc kubenswrapper[4815]: W1207 19:33:12.336307 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77640a82_dd82_4fe6_89f8_58d8ab094b71.slice/crio-863614b60046d7e5bf9604885db682773ddf964bd634dc214cade501a05c5ac1 WatchSource:0}: Error finding container 863614b60046d7e5bf9604885db682773ddf964bd634dc214cade501a05c5ac1: Status 404 returned error can't find the container with id 863614b60046d7e5bf9604885db682773ddf964bd634dc214cade501a05c5ac1 Dec 07 19:33:12 crc kubenswrapper[4815]: I1207 19:33:12.845962 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:33:12 crc kubenswrapper[4815]: I1207 19:33:12.873684 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"77640a82-dd82-4fe6-89f8-58d8ab094b71","Type":"ContainerStarted","Data":"863614b60046d7e5bf9604885db682773ddf964bd634dc214cade501a05c5ac1"} Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.325840 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.325991 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts 
--keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jnsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-ts9zd_openstack(75e23e74-8711-40ca-82e9-ff6b9854a60e): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.327283 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" podUID="75e23e74-8711-40ca-82e9-ff6b9854a60e" Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.493464 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.493728 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wsb4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-zdbbg_openstack(20f3d558-f5be-4f87-84c2-627a3e9dadde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:33:13 crc kubenswrapper[4815]: E1207 19:33:13.495095 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" podUID="20f3d558-f5be-4f87-84c2-627a3e9dadde" Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.834863 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:33:13 crc kubenswrapper[4815]: W1207 19:33:13.839456 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a52cbc1_9245_48e5_8b22_0cdf96dc671b.slice/crio-5a19d08b809ab8d14824d5dfe004c90a1298db2d2ff70003d6ea5533dbd522ed WatchSource:0}: Error finding container 5a19d08b809ab8d14824d5dfe004c90a1298db2d2ff70003d6ea5533dbd522ed: Status 404 returned error can't find the container with id 5a19d08b809ab8d14824d5dfe004c90a1298db2d2ff70003d6ea5533dbd522ed Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.867030 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.888289 4815 generic.go:334] "Generic (PLEG): container finished" podID="5847a638-115d-4f64-afe1-2935c623a28e" containerID="4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612" exitCode=0 Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.888660 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" event={"ID":"5847a638-115d-4f64-afe1-2935c623a28e","Type":"ContainerDied","Data":"4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612"} Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.898778 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerStarted","Data":"d7fe93dc9250c34e0ee1179b01a5884f7819d6ca8d2a2943b4c221bf26f7d727"} Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.901781 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8a52cbc1-9245-48e5-8b22-0cdf96dc671b","Type":"ContainerStarted","Data":"5a19d08b809ab8d14824d5dfe004c90a1298db2d2ff70003d6ea5533dbd522ed"} Dec 07 19:33:13 crc kubenswrapper[4815]: I1207 19:33:13.921246 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerStarted","Data":"289ff882ed3c27a9575db558b3d0520fa25a5b7096254e4ee4d73a916e9fca58"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.070866 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.073362 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 07 19:33:14 crc kubenswrapper[4815]: W1207 19:33:14.099300 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod814a06c9_c432_4a32_835e_59a4831cf335.slice/crio-809efcbb49440b5061e03637b002d559b4f1fef02a6cea690806e9481f79a192 WatchSource:0}: Error finding container 809efcbb49440b5061e03637b002d559b4f1fef02a6cea690806e9481f79a192: Status 404 returned error can't find the container with id 809efcbb49440b5061e03637b002d559b4f1fef02a6cea690806e9481f79a192 Dec 07 19:33:14 crc kubenswrapper[4815]: E1207 19:33:14.157602 4815 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 07 19:33:14 crc kubenswrapper[4815]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/5847a638-115d-4f64-afe1-2935c623a28e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 07 19:33:14 crc kubenswrapper[4815]: > podSandboxID="e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c" Dec 07 19:33:14 crc kubenswrapper[4815]: E1207 19:33:14.157765 4815 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 07 19:33:14 crc 
kubenswrapper[4815]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9gckx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-2wpl6_openstack(5847a638-115d-4f64-afe1-2935c623a28e): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/5847a638-115d-4f64-afe1-2935c623a28e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 07 19:33:14 crc kubenswrapper[4815]: > logger="UnhandledError" Dec 07 19:33:14 crc kubenswrapper[4815]: E1207 19:33:14.159042 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/5847a638-115d-4f64-afe1-2935c623a28e/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" podUID="5847a638-115d-4f64-afe1-2935c623a28e" Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.447859 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-24gkx"] Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.604444 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 07 19:33:14 crc 
kubenswrapper[4815]: I1207 19:33:14.739969 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.745856 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.931340 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1dbe253c-608d-4711-904b-44926572c998","Type":"ContainerStarted","Data":"2976d5f567059c99d6ada256bab738f0a253563c1bd0f6195289800dd709d25f"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.933371 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx" event={"ID":"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850","Type":"ContainerStarted","Data":"d92516684cacb8731dec08bb417c48c00fe35eb20ef6a2fa1800053e0adadf8e"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.936641 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"17df86ce-9c36-46d9-b5f2-d5dae8ae4675","Type":"ContainerStarted","Data":"79591086b3612ec6e7cd98df8a103808a4af1585879dd41e000603168bd415b9"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.943074 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b428ca75-b6e8-428e-be32-eb320bacbdda","Type":"ContainerStarted","Data":"90208230ccc2dec0370c1fd04383dfad4e12a4334fab563789d643bd4c5dbd6b"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.949823 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" event={"ID":"75e23e74-8711-40ca-82e9-ff6b9854a60e","Type":"ContainerDied","Data":"1904d44a412c2b79e197f3b5802cfd452d4e81e866a465cb268c16b588ae6366"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.949905 4815 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-ts9zd" Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.951452 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" event={"ID":"20f3d558-f5be-4f87-84c2-627a3e9dadde","Type":"ContainerDied","Data":"4e863b62718c51f22196f68c80b467d91d6dc1fb4a21ce5053074a0028498423"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.951490 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-zdbbg" Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.952781 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerStarted","Data":"809efcbb49440b5061e03637b002d559b4f1fef02a6cea690806e9481f79a192"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.955841 4815 generic.go:334] "Generic (PLEG): container finished" podID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerID="289ff882ed3c27a9575db558b3d0520fa25a5b7096254e4ee4d73a916e9fca58" exitCode=0 Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.956776 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerDied","Data":"289ff882ed3c27a9575db558b3d0520fa25a5b7096254e4ee4d73a916e9fca58"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.956800 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerStarted","Data":"e6f7b669c3d0eacc1604d49bd31ebb64dcc88da5928a03856a46428aa1211ace"} Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.956814 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:33:14 crc kubenswrapper[4815]: 
I1207 19:33:14.999051 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jnsn\" (UniqueName: \"kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn\") pod \"75e23e74-8711-40ca-82e9-ff6b9854a60e\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.999144 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config\") pod \"20f3d558-f5be-4f87-84c2-627a3e9dadde\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.999210 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc\") pod \"75e23e74-8711-40ca-82e9-ff6b9854a60e\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.999282 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsb4x\" (UniqueName: \"kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x\") pod \"20f3d558-f5be-4f87-84c2-627a3e9dadde\" (UID: \"20f3d558-f5be-4f87-84c2-627a3e9dadde\") " Dec 07 19:33:14 crc kubenswrapper[4815]: I1207 19:33:14.999316 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config\") pod \"75e23e74-8711-40ca-82e9-ff6b9854a60e\" (UID: \"75e23e74-8711-40ca-82e9-ff6b9854a60e\") " Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.002607 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config" (OuterVolumeSpecName: "config") pod "75e23e74-8711-40ca-82e9-ff6b9854a60e" 
(UID: "75e23e74-8711-40ca-82e9-ff6b9854a60e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.002644 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75e23e74-8711-40ca-82e9-ff6b9854a60e" (UID: "75e23e74-8711-40ca-82e9-ff6b9854a60e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.003011 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config" (OuterVolumeSpecName: "config") pod "20f3d558-f5be-4f87-84c2-627a3e9dadde" (UID: "20f3d558-f5be-4f87-84c2-627a3e9dadde"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.003929 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn" (OuterVolumeSpecName: "kube-api-access-2jnsn") pod "75e23e74-8711-40ca-82e9-ff6b9854a60e" (UID: "75e23e74-8711-40ca-82e9-ff6b9854a60e"). InnerVolumeSpecName "kube-api-access-2jnsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.009966 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x" (OuterVolumeSpecName: "kube-api-access-wsb4x") pod "20f3d558-f5be-4f87-84c2-627a3e9dadde" (UID: "20f3d558-f5be-4f87-84c2-627a3e9dadde"). InnerVolumeSpecName "kube-api-access-wsb4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.031344 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" podStartSLOduration=2.836827775 podStartE2EDuration="20.031323854s" podCreationTimestamp="2025-12-07 19:32:55 +0000 UTC" firstStartedPulling="2025-12-07 19:32:56.3962438 +0000 UTC m=+1080.975233845" lastFinishedPulling="2025-12-07 19:33:13.590739879 +0000 UTC m=+1098.169729924" observedRunningTime="2025-12-07 19:33:15.016954884 +0000 UTC m=+1099.595944929" watchObservedRunningTime="2025-12-07 19:33:15.031323854 +0000 UTC m=+1099.610313909" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.101840 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jnsn\" (UniqueName: \"kubernetes.io/projected/75e23e74-8711-40ca-82e9-ff6b9854a60e-kube-api-access-2jnsn\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.101870 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20f3d558-f5be-4f87-84c2-627a3e9dadde-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.101886 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.101894 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsb4x\" (UniqueName: \"kubernetes.io/projected/20f3d558-f5be-4f87-84c2-627a3e9dadde-kube-api-access-wsb4x\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.101904 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75e23e74-8711-40ca-82e9-ff6b9854a60e-config\") on node \"crc\" DevicePath 
\"\"" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.375181 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.379350 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-zdbbg"] Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.423463 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.431323 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-ts9zd"] Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.460924 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bhvc5"] Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.838958 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f3d558-f5be-4f87-84c2-627a3e9dadde" path="/var/lib/kubelet/pods/20f3d558-f5be-4f87-84c2-627a3e9dadde/volumes" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.839750 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e23e74-8711-40ca-82e9-ff6b9854a60e" path="/var/lib/kubelet/pods/75e23e74-8711-40ca-82e9-ff6b9854a60e/volumes" Dec 07 19:33:15 crc kubenswrapper[4815]: I1207 19:33:15.995993 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhvc5" event={"ID":"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21","Type":"ContainerStarted","Data":"e6a13fcb7d7d3e9888e8a51e724a37ff57a5bd88a1839d0e86c72e2200e69c4c"} Dec 07 19:33:16 crc kubenswrapper[4815]: I1207 19:33:16.506851 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.120264 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7fzms"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 
19:33:17.121220 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.125032 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.153994 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-combined-ca-bundle\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.154053 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovn-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.154085 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovs-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.154104 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 
19:33:17.154128 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxg9\" (UniqueName: \"kubernetes.io/projected/83334f9a-9d33-4229-9552-9f69c525dd82-kube-api-access-hxxg9\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.154174 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83334f9a-9d33-4229-9552-9f69c525dd82-config\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.157779 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7fzms"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257264 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-combined-ca-bundle\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257350 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovn-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257388 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovs-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: 
\"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257435 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257458 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxg9\" (UniqueName: \"kubernetes.io/projected/83334f9a-9d33-4229-9552-9f69c525dd82-kube-api-access-hxxg9\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.257529 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83334f9a-9d33-4229-9552-9f69c525dd82-config\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.258255 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovs-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.258403 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/83334f9a-9d33-4229-9552-9f69c525dd82-ovn-rundir\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " 
pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.258601 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83334f9a-9d33-4229-9552-9f69c525dd82-config\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.264801 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-combined-ca-bundle\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.278364 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/83334f9a-9d33-4229-9552-9f69c525dd82-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.286453 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxg9\" (UniqueName: \"kubernetes.io/projected/83334f9a-9d33-4229-9552-9f69c525dd82-kube-api-access-hxxg9\") pod \"ovn-controller-metrics-7fzms\" (UID: \"83334f9a-9d33-4229-9552-9f69c525dd82\") " pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.418386 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.460799 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.462623 4815 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.465201 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.476068 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7fzms" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.493488 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.568560 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.568626 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vnq7\" (UniqueName: \"kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.568715 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.568792 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.657016 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.657718 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="dnsmasq-dns" containerID="cri-o://e6f7b669c3d0eacc1604d49bd31ebb64dcc88da5928a03856a46428aa1211ace" gracePeriod=10 Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.691500 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.691881 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.692037 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vnq7\" (UniqueName: \"kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.692282 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.692964 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.693258 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.693680 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.733801 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vnq7\" (UniqueName: \"kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7\") pod \"dnsmasq-dns-5bf47b49b7-wrhqm\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.736393 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:33:17 crc 
kubenswrapper[4815]: I1207 19:33:17.737907 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.741048 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.746539 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.785905 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.796311 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.796381 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9nkb\" (UniqueName: \"kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.796434 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.796458 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.796515 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.897658 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.897751 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.897811 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9nkb\" (UniqueName: \"kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.897837 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.897858 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.898718 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.899322 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.900005 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.900725 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:17 crc kubenswrapper[4815]: I1207 19:33:17.934468 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9nkb\" (UniqueName: \"kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb\") pod \"dnsmasq-dns-8554648995-mhmdw\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:18 crc kubenswrapper[4815]: I1207 19:33:18.017889 4815 generic.go:334] "Generic (PLEG): container finished" podID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerID="e6f7b669c3d0eacc1604d49bd31ebb64dcc88da5928a03856a46428aa1211ace" exitCode=0 Dec 07 19:33:18 crc kubenswrapper[4815]: I1207 19:33:18.017948 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerDied","Data":"e6f7b669c3d0eacc1604d49bd31ebb64dcc88da5928a03856a46428aa1211ace"} Dec 07 19:33:18 crc kubenswrapper[4815]: I1207 19:33:18.127048 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:20 crc kubenswrapper[4815]: I1207 19:33:20.572899 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Dec 07 19:33:20 crc kubenswrapper[4815]: W1207 19:33:20.756777 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ebfeb2d_526f_4f02_be1d_def3f49c555c.slice/crio-6d52d91071ccec201c802b919d7a7b4122db2fff3ef84bb33bb372a2381fa022 WatchSource:0}: Error finding container 6d52d91071ccec201c802b919d7a7b4122db2fff3ef84bb33bb372a2381fa022: Status 404 returned error can't find the container with id 6d52d91071ccec201c802b919d7a7b4122db2fff3ef84bb33bb372a2381fa022 Dec 07 19:33:21 crc kubenswrapper[4815]: I1207 19:33:21.038805 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4ebfeb2d-526f-4f02-be1d-def3f49c555c","Type":"ContainerStarted","Data":"6d52d91071ccec201c802b919d7a7b4122db2fff3ef84bb33bb372a2381fa022"} Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.562081 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.718225 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dgzw\" (UniqueName: \"kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw\") pod \"3864f9d5-9110-45d5-a5a1-9790beeae00c\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.718380 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config\") pod \"3864f9d5-9110-45d5-a5a1-9790beeae00c\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.718430 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc\") pod \"3864f9d5-9110-45d5-a5a1-9790beeae00c\" (UID: \"3864f9d5-9110-45d5-a5a1-9790beeae00c\") " Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.722662 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw" (OuterVolumeSpecName: "kube-api-access-4dgzw") pod "3864f9d5-9110-45d5-a5a1-9790beeae00c" (UID: "3864f9d5-9110-45d5-a5a1-9790beeae00c"). InnerVolumeSpecName "kube-api-access-4dgzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.752844 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3864f9d5-9110-45d5-a5a1-9790beeae00c" (UID: "3864f9d5-9110-45d5-a5a1-9790beeae00c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.756553 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config" (OuterVolumeSpecName: "config") pod "3864f9d5-9110-45d5-a5a1-9790beeae00c" (UID: "3864f9d5-9110-45d5-a5a1-9790beeae00c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.820341 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dgzw\" (UniqueName: \"kubernetes.io/projected/3864f9d5-9110-45d5-a5a1-9790beeae00c-kube-api-access-4dgzw\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.820384 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:25 crc kubenswrapper[4815]: I1207 19:33:25.820398 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3864f9d5-9110-45d5-a5a1-9790beeae00c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.083194 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" event={"ID":"3864f9d5-9110-45d5-a5a1-9790beeae00c","Type":"ContainerDied","Data":"f8d3eb4163f8c7d7e11ec5e82d5b1b582ddeab0a08eb1502232d7ba79566396a"} Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.083523 4815 scope.go:117] "RemoveContainer" containerID="e6f7b669c3d0eacc1604d49bd31ebb64dcc88da5928a03856a46428aa1211ace" Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.083652 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2gxx9" Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.109728 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.115234 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2gxx9"] Dec 07 19:33:26 crc kubenswrapper[4815]: I1207 19:33:26.870953 4815 scope.go:117] "RemoveContainer" containerID="289ff882ed3c27a9575db558b3d0520fa25a5b7096254e4ee4d73a916e9fca58" Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.108182 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" event={"ID":"5847a638-115d-4f64-afe1-2935c623a28e","Type":"ContainerStarted","Data":"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e"} Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.108271 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="dnsmasq-dns" containerID="cri-o://fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e" gracePeriod=10 Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.108771 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.129885 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" podStartSLOduration=15.419373374 podStartE2EDuration="33.129869449s" podCreationTimestamp="2025-12-07 19:32:54 +0000 UTC" firstStartedPulling="2025-12-07 19:32:55.837442952 +0000 UTC m=+1080.416432997" lastFinishedPulling="2025-12-07 19:33:13.547939027 +0000 UTC m=+1098.126929072" observedRunningTime="2025-12-07 19:33:27.125490854 +0000 UTC m=+1111.704480899" 
watchObservedRunningTime="2025-12-07 19:33:27.129869449 +0000 UTC m=+1111.708859484" Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.162078 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7fzms"] Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.251987 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:33:27 crc kubenswrapper[4815]: W1207 19:33:27.281512 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83334f9a_9d33_4229_9552_9f69c525dd82.slice/crio-6d6b650d3267ebd2350eb3346303df4be74b98ce0025e1578059eb2effb36a04 WatchSource:0}: Error finding container 6d6b650d3267ebd2350eb3346303df4be74b98ce0025e1578059eb2effb36a04: Status 404 returned error can't find the container with id 6d6b650d3267ebd2350eb3346303df4be74b98ce0025e1578059eb2effb36a04 Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.357074 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.778256 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" path="/var/lib/kubelet/pods/3864f9d5-9110-45d5-a5a1-9790beeae00c/volumes" Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.809512 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.877442 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc\") pod \"5847a638-115d-4f64-afe1-2935c623a28e\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.877494 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config\") pod \"5847a638-115d-4f64-afe1-2935c623a28e\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " Dec 07 19:33:27 crc kubenswrapper[4815]: I1207 19:33:27.978964 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gckx\" (UniqueName: \"kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx\") pod \"5847a638-115d-4f64-afe1-2935c623a28e\" (UID: \"5847a638-115d-4f64-afe1-2935c623a28e\") " Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.000529 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx" (OuterVolumeSpecName: "kube-api-access-9gckx") pod "5847a638-115d-4f64-afe1-2935c623a28e" (UID: "5847a638-115d-4f64-afe1-2935c623a28e"). InnerVolumeSpecName "kube-api-access-9gckx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.080529 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gckx\" (UniqueName: \"kubernetes.io/projected/5847a638-115d-4f64-afe1-2935c623a28e-kube-api-access-9gckx\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.124504 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerStarted","Data":"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.128748 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7fzms" event={"ID":"83334f9a-9d33-4229-9552-9f69c525dd82","Type":"ContainerStarted","Data":"6d6b650d3267ebd2350eb3346303df4be74b98ce0025e1578059eb2effb36a04"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.130553 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4ebfeb2d-526f-4f02-be1d-def3f49c555c","Type":"ContainerStarted","Data":"6b10493cac86fc86f5eea8c9a88324a383a4c1571655245d127115ce1c41e175"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.132726 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx" event={"ID":"5d3ed0f7-1ea2-48e7-bab4-f5a709da4850","Type":"ContainerStarted","Data":"97e764462965c5f47acaa2ad4edb2a6d370493c487dbfa860113311ab50fd539"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.132866 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-24gkx" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.136545 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"17df86ce-9c36-46d9-b5f2-d5dae8ae4675","Type":"ContainerStarted","Data":"a553e7ba7d88d2010a9f475b0ac97d1b6848e41c64295f46f2ded2ba62418927"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.141052 4815 generic.go:334] "Generic (PLEG): container finished" podID="5847a638-115d-4f64-afe1-2935c623a28e" containerID="fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e" exitCode=0 Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.141106 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" event={"ID":"5847a638-115d-4f64-afe1-2935c623a28e","Type":"ContainerDied","Data":"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.141131 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" event={"ID":"5847a638-115d-4f64-afe1-2935c623a28e","Type":"ContainerDied","Data":"e17efe4f2a4d42448fefd6384ac55e70b2a532e25131b0821bb06a58b41ec55c"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.141146 4815 scope.go:117] "RemoveContainer" containerID="fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.141188 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2wpl6" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.143445 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"77640a82-dd82-4fe6-89f8-58d8ab094b71","Type":"ContainerStarted","Data":"bc3612029e8d5cef2c319fb7ba85dfae3c8b00da4820d54c26c9f1c47b62e849"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.144276 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.146024 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" event={"ID":"c25272ad-ab34-434f-b8be-a945f45ca00b","Type":"ContainerStarted","Data":"065305783d9d62062297823da9dc7d536cc6e1774c5c434d28f78b0d6f70ecec"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.162619 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhmdw" event={"ID":"32f95e51-757c-4167-ad8d-32f472266fe5","Type":"ContainerStarted","Data":"fa7a4880bd65fed8bace853161447f89603fc2e95d799864354b90ee528c5d4c"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.168362 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1dbe253c-608d-4711-904b-44926572c998","Type":"ContainerStarted","Data":"877982319df3dc1d390210137beb2590c427fb343cf0e38e6766116eb82dba57"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.170362 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b428ca75-b6e8-428e-be32-eb320bacbdda","Type":"ContainerStarted","Data":"6f642e6d6fb2b5fbbe64af7046ef0cacf3a079805f00cf14393e7244fffb007b"} Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.184323 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-24gkx" podStartSLOduration=9.916219071 
podStartE2EDuration="22.184307224s" podCreationTimestamp="2025-12-07 19:33:06 +0000 UTC" firstStartedPulling="2025-12-07 19:33:14.531898021 +0000 UTC m=+1099.110888066" lastFinishedPulling="2025-12-07 19:33:26.799986164 +0000 UTC m=+1111.378976219" observedRunningTime="2025-12-07 19:33:28.180384382 +0000 UTC m=+1112.759374437" watchObservedRunningTime="2025-12-07 19:33:28.184307224 +0000 UTC m=+1112.763297269" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.208212 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.075994361 podStartE2EDuration="29.208186356s" podCreationTimestamp="2025-12-07 19:32:59 +0000 UTC" firstStartedPulling="2025-12-07 19:33:12.354624119 +0000 UTC m=+1096.933614164" lastFinishedPulling="2025-12-07 19:33:25.486816114 +0000 UTC m=+1110.065806159" observedRunningTime="2025-12-07 19:33:28.198798188 +0000 UTC m=+1112.777788233" watchObservedRunningTime="2025-12-07 19:33:28.208186356 +0000 UTC m=+1112.787176411" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.209172 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5847a638-115d-4f64-afe1-2935c623a28e" (UID: "5847a638-115d-4f64-afe1-2935c623a28e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.285737 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.360270 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config" (OuterVolumeSpecName: "config") pod "5847a638-115d-4f64-afe1-2935c623a28e" (UID: "5847a638-115d-4f64-afe1-2935c623a28e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.387897 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5847a638-115d-4f64-afe1-2935c623a28e-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.458600 4815 scope.go:117] "RemoveContainer" containerID="4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.483826 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.486249 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2wpl6"] Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.530338 4815 scope.go:117] "RemoveContainer" containerID="fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e" Dec 07 19:33:28 crc kubenswrapper[4815]: E1207 19:33:28.532190 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e\": container with ID starting with 
fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e not found: ID does not exist" containerID="fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.532233 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e"} err="failed to get container status \"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e\": rpc error: code = NotFound desc = could not find container \"fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e\": container with ID starting with fcb9dcf4285f5a86e0fdf17104b5d1e73495bb246bfb39792950fbec8645fe0e not found: ID does not exist" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.532261 4815 scope.go:117] "RemoveContainer" containerID="4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612" Dec 07 19:33:28 crc kubenswrapper[4815]: E1207 19:33:28.532745 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612\": container with ID starting with 4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612 not found: ID does not exist" containerID="4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612" Dec 07 19:33:28 crc kubenswrapper[4815]: I1207 19:33:28.532796 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612"} err="failed to get container status \"4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612\": rpc error: code = NotFound desc = could not find container \"4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612\": container with ID starting with 4c7287fe1b5ce4f58b049260fb15eada94d97775d0ebbd77ad06b1f02e140612 not found: ID does not 
exist" Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.190205 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a52cbc1-9245-48e5-8b22-0cdf96dc671b","Type":"ContainerStarted","Data":"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472"} Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.190286 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.193697 4815 generic.go:334] "Generic (PLEG): container finished" podID="2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21" containerID="71722b2fd4ca8b68c6924ac2ee72c9db710ad360f785999fd15f8043f2f21f29" exitCode=0 Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.193857 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhvc5" event={"ID":"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21","Type":"ContainerDied","Data":"71722b2fd4ca8b68c6924ac2ee72c9db710ad360f785999fd15f8043f2f21f29"} Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.214626 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.140354888 podStartE2EDuration="29.21461018s" podCreationTimestamp="2025-12-07 19:33:00 +0000 UTC" firstStartedPulling="2025-12-07 19:33:13.846219781 +0000 UTC m=+1098.425209826" lastFinishedPulling="2025-12-07 19:33:26.920475073 +0000 UTC m=+1111.499465118" observedRunningTime="2025-12-07 19:33:29.207288831 +0000 UTC m=+1113.786278876" watchObservedRunningTime="2025-12-07 19:33:29.21461018 +0000 UTC m=+1113.793600225" Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.224297 4815 generic.go:334] "Generic (PLEG): container finished" podID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerID="1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307" exitCode=0 Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.224385 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" event={"ID":"c25272ad-ab34-434f-b8be-a945f45ca00b","Type":"ContainerDied","Data":"1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307"} Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.234992 4815 generic.go:334] "Generic (PLEG): container finished" podID="32f95e51-757c-4167-ad8d-32f472266fe5" containerID="d28702612d320e416cdc28f964877956ebcc35a1d48210bdfc819fcff8b5e4f7" exitCode=0 Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.235050 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhmdw" event={"ID":"32f95e51-757c-4167-ad8d-32f472266fe5","Type":"ContainerDied","Data":"d28702612d320e416cdc28f964877956ebcc35a1d48210bdfc819fcff8b5e4f7"} Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.241629 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerStarted","Data":"35fe694f36a5b968fea28439d6a0189c15b3d94d39753db9a98857fefc277512"} Dec 07 19:33:29 crc kubenswrapper[4815]: I1207 19:33:29.780144 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5847a638-115d-4f64-afe1-2935c623a28e" path="/var/lib/kubelet/pods/5847a638-115d-4f64-afe1-2935c623a28e/volumes" Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.263420 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhvc5" event={"ID":"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21","Type":"ContainerStarted","Data":"b3f4b372592c04ab42a7ad0fdab151f8223ac306b32ed21584124b51908a270a"} Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.268091 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" event={"ID":"c25272ad-ab34-434f-b8be-a945f45ca00b","Type":"ContainerStarted","Data":"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6"} Dec 07 
19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.268131 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.269937 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhmdw" event={"ID":"32f95e51-757c-4167-ad8d-32f472266fe5","Type":"ContainerStarted","Data":"37094d35dbf9ecac4d2980e440888c791ee77a7066969174f63a4bb00a828281"} Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.270151 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.291750 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" podStartSLOduration=15.291728165 podStartE2EDuration="15.291728165s" podCreationTimestamp="2025-12-07 19:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:33:32.289367428 +0000 UTC m=+1116.868357483" watchObservedRunningTime="2025-12-07 19:33:32.291728165 +0000 UTC m=+1116.870718220" Dec 07 19:33:32 crc kubenswrapper[4815]: I1207 19:33:32.321528 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-mhmdw" podStartSLOduration=15.321511365 podStartE2EDuration="15.321511365s" podCreationTimestamp="2025-12-07 19:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:33:32.316353528 +0000 UTC m=+1116.895343593" watchObservedRunningTime="2025-12-07 19:33:32.321511365 +0000 UTC m=+1116.900501410" Dec 07 19:33:34 crc kubenswrapper[4815]: I1207 19:33:34.308187 4815 generic.go:334] "Generic (PLEG): container finished" podID="1dbe253c-608d-4711-904b-44926572c998" 
containerID="877982319df3dc1d390210137beb2590c427fb343cf0e38e6766116eb82dba57" exitCode=0 Dec 07 19:33:34 crc kubenswrapper[4815]: I1207 19:33:34.308266 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1dbe253c-608d-4711-904b-44926572c998","Type":"ContainerDied","Data":"877982319df3dc1d390210137beb2590c427fb343cf0e38e6766116eb82dba57"} Dec 07 19:33:34 crc kubenswrapper[4815]: I1207 19:33:34.500559 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.317314 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"1dbe253c-608d-4711-904b-44926572c998","Type":"ContainerStarted","Data":"b4c6129470867bb5ddc7011cd67810675c00419ab8eae133674a546616214ea9"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.320566 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7fzms" event={"ID":"83334f9a-9d33-4229-9552-9f69c525dd82","Type":"ContainerStarted","Data":"6e76da78373e906df094c8777ec07a629f8349a4247c768251b4b869354f098f"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.324215 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bhvc5" event={"ID":"2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21","Type":"ContainerStarted","Data":"4fcf94e752275bea365735d8265331a88af9055207f52aaba326319e8ed9c28c"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.324287 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.324326 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.326434 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"4ebfeb2d-526f-4f02-be1d-def3f49c555c","Type":"ContainerStarted","Data":"b980e4c3ee5fd69ac004211b25fd338567450bc6c72681aa335a845f60339623"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.329450 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"17df86ce-9c36-46d9-b5f2-d5dae8ae4675","Type":"ContainerStarted","Data":"45457879c5b556a85cff7b0873ac56ef9801b2e24d2a5fa5b0609df27a1d445e"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.332947 4815 generic.go:334] "Generic (PLEG): container finished" podID="b428ca75-b6e8-428e-be32-eb320bacbdda" containerID="6f642e6d6fb2b5fbbe64af7046ef0cacf3a079805f00cf14393e7244fffb007b" exitCode=0 Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.332988 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b428ca75-b6e8-428e-be32-eb320bacbdda","Type":"ContainerDied","Data":"6f642e6d6fb2b5fbbe64af7046ef0cacf3a079805f00cf14393e7244fffb007b"} Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.393217 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bhvc5" podStartSLOduration=18.204475956 podStartE2EDuration="29.393201824s" podCreationTimestamp="2025-12-07 19:33:06 +0000 UTC" firstStartedPulling="2025-12-07 19:33:15.608971281 +0000 UTC m=+1100.187961326" lastFinishedPulling="2025-12-07 19:33:26.797697149 +0000 UTC m=+1111.376687194" observedRunningTime="2025-12-07 19:33:35.389637642 +0000 UTC m=+1119.968627687" watchObservedRunningTime="2025-12-07 19:33:35.393201824 +0000 UTC m=+1119.972191869" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.398633 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=26.208177973 podStartE2EDuration="38.398615519s" podCreationTimestamp="2025-12-07 19:32:57 +0000 UTC" firstStartedPulling="2025-12-07 19:33:14.088211638 +0000 UTC 
m=+1098.667201683" lastFinishedPulling="2025-12-07 19:33:26.278649184 +0000 UTC m=+1110.857639229" observedRunningTime="2025-12-07 19:33:35.357139975 +0000 UTC m=+1119.936130060" watchObservedRunningTime="2025-12-07 19:33:35.398615519 +0000 UTC m=+1119.977605614" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.461504 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.046722261 podStartE2EDuration="30.461481573s" podCreationTimestamp="2025-12-07 19:33:05 +0000 UTC" firstStartedPulling="2025-12-07 19:33:20.75837528 +0000 UTC m=+1105.337365325" lastFinishedPulling="2025-12-07 19:33:34.173134592 +0000 UTC m=+1118.752124637" observedRunningTime="2025-12-07 19:33:35.457208081 +0000 UTC m=+1120.036198126" watchObservedRunningTime="2025-12-07 19:33:35.461481573 +0000 UTC m=+1120.040471618" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.490205 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7fzms" podStartSLOduration=11.633172676000001 podStartE2EDuration="18.490182842s" podCreationTimestamp="2025-12-07 19:33:17 +0000 UTC" firstStartedPulling="2025-12-07 19:33:27.285168082 +0000 UTC m=+1111.864158127" lastFinishedPulling="2025-12-07 19:33:34.142178248 +0000 UTC m=+1118.721168293" observedRunningTime="2025-12-07 19:33:35.477700506 +0000 UTC m=+1120.056690561" watchObservedRunningTime="2025-12-07 19:33:35.490182842 +0000 UTC m=+1120.069172887" Dec 07 19:33:35 crc kubenswrapper[4815]: I1207 19:33:35.571350 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.054503847 podStartE2EDuration="29.571328068s" podCreationTimestamp="2025-12-07 19:33:06 +0000 UTC" firstStartedPulling="2025-12-07 19:33:14.672454632 +0000 UTC m=+1099.251444677" lastFinishedPulling="2025-12-07 19:33:34.189278853 +0000 UTC m=+1118.768268898" observedRunningTime="2025-12-07 
19:33:35.568445966 +0000 UTC m=+1120.147436021" watchObservedRunningTime="2025-12-07 19:33:35.571328068 +0000 UTC m=+1120.150318113" Dec 07 19:33:36 crc kubenswrapper[4815]: I1207 19:33:36.498676 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:36 crc kubenswrapper[4815]: I1207 19:33:36.498741 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:36 crc kubenswrapper[4815]: I1207 19:33:36.545373 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:37 crc kubenswrapper[4815]: I1207 19:33:37.400231 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 07 19:33:37 crc kubenswrapper[4815]: I1207 19:33:37.705543 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:37 crc kubenswrapper[4815]: I1207 19:33:37.705601 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:37 crc kubenswrapper[4815]: I1207 19:33:37.743053 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:37 crc kubenswrapper[4815]: I1207 19:33:37.788326 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.129136 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.182409 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.355648 4815 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerName="dnsmasq-dns" containerID="cri-o://62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6" gracePeriod=10 Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.412563 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.772373 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 07 19:33:38 crc kubenswrapper[4815]: E1207 19:33:38.773151 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773169 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: E1207 19:33:38.773214 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="init" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773223 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="init" Dec 07 19:33:38 crc kubenswrapper[4815]: E1207 19:33:38.773236 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="init" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773244 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="init" Dec 07 19:33:38 crc kubenswrapper[4815]: E1207 19:33:38.773304 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773315 4815 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773575 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="3864f9d5-9110-45d5-a5a1-9790beeae00c" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.773617 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="5847a638-115d-4f64-afe1-2935c623a28e" containerName="dnsmasq-dns" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.776071 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.786465 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.786766 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.787241 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-vqhbq" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.787403 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.807569 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850161 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-scripts\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850483 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850507 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-config\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850551 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850579 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850602 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6khc\" (UniqueName: \"kubernetes.io/projected/46adfee5-dc94-49a0-bf62-33d15b10e89c-kube-api-access-d6khc\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.850645 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951490 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951635 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-scripts\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951684 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951735 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-config\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951782 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 
19:33:38.951827 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.951860 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6khc\" (UniqueName: \"kubernetes.io/projected/46adfee5-dc94-49a0-bf62-33d15b10e89c-kube-api-access-d6khc\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.952638 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.953259 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-scripts\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.955300 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46adfee5-dc94-49a0-bf62-33d15b10e89c-config\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.965165 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:38 crc kubenswrapper[4815]: I1207 19:33:38.990022 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:38.991303 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.057534 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.071877 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46adfee5-dc94-49a0-bf62-33d15b10e89c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.086452 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6khc\" (UniqueName: \"kubernetes.io/projected/46adfee5-dc94-49a0-bf62-33d15b10e89c-kube-api-access-d6khc\") pod \"ovn-northd-0\" (UID: \"46adfee5-dc94-49a0-bf62-33d15b10e89c\") " pod="openstack/ovn-northd-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.112610 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.224264 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.366305 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc\") pod \"c25272ad-ab34-434f-b8be-a945f45ca00b\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.366470 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vnq7\" (UniqueName: \"kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7\") pod \"c25272ad-ab34-434f-b8be-a945f45ca00b\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.366766 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config\") pod \"c25272ad-ab34-434f-b8be-a945f45ca00b\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.366899 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb\") pod \"c25272ad-ab34-434f-b8be-a945f45ca00b\" (UID: \"c25272ad-ab34-434f-b8be-a945f45ca00b\") " Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.371893 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7" (OuterVolumeSpecName: "kube-api-access-5vnq7") pod "c25272ad-ab34-434f-b8be-a945f45ca00b" (UID: "c25272ad-ab34-434f-b8be-a945f45ca00b"). InnerVolumeSpecName "kube-api-access-5vnq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.393259 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b428ca75-b6e8-428e-be32-eb320bacbdda","Type":"ContainerStarted","Data":"fa94dcb547c6bae2e047b60cdb8b54953ec941ee55a277bcc351584d6114affb"} Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.406500 4815 generic.go:334] "Generic (PLEG): container finished" podID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerID="62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6" exitCode=0 Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.406826 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" event={"ID":"c25272ad-ab34-434f-b8be-a945f45ca00b","Type":"ContainerDied","Data":"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6"} Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.406900 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" event={"ID":"c25272ad-ab34-434f-b8be-a945f45ca00b","Type":"ContainerDied","Data":"065305783d9d62062297823da9dc7d536cc6e1774c5c434d28f78b0d6f70ecec"} Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.406939 4815 scope.go:117] "RemoveContainer" containerID="62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.407144 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-wrhqm" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.424623 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c25272ad-ab34-434f-b8be-a945f45ca00b" (UID: "c25272ad-ab34-434f-b8be-a945f45ca00b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.427129 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=30.405271191 podStartE2EDuration="43.427102506s" podCreationTimestamp="2025-12-07 19:32:56 +0000 UTC" firstStartedPulling="2025-12-07 19:33:13.953600886 +0000 UTC m=+1098.532590931" lastFinishedPulling="2025-12-07 19:33:26.975432201 +0000 UTC m=+1111.554422246" observedRunningTime="2025-12-07 19:33:39.424950545 +0000 UTC m=+1124.003940590" watchObservedRunningTime="2025-12-07 19:33:39.427102506 +0000 UTC m=+1124.006092571" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.476323 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.476366 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vnq7\" (UniqueName: \"kubernetes.io/projected/c25272ad-ab34-434f-b8be-a945f45ca00b-kube-api-access-5vnq7\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.482984 4815 scope.go:117] "RemoveContainer" containerID="1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.483234 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c25272ad-ab34-434f-b8be-a945f45ca00b" (UID: "c25272ad-ab34-434f-b8be-a945f45ca00b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.518646 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config" (OuterVolumeSpecName: "config") pod "c25272ad-ab34-434f-b8be-a945f45ca00b" (UID: "c25272ad-ab34-434f-b8be-a945f45ca00b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.578271 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.578312 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c25272ad-ab34-434f-b8be-a945f45ca00b-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.582968 4815 scope.go:117] "RemoveContainer" containerID="62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6" Dec 07 19:33:39 crc kubenswrapper[4815]: E1207 19:33:39.584089 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6\": container with ID starting with 62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6 not found: ID does not exist" containerID="62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.584132 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6"} err="failed to get container status \"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6\": rpc error: code = 
NotFound desc = could not find container \"62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6\": container with ID starting with 62971a2b227f29eb8d8b9cef1188515dd4da18679ff8d2b7756ae0c9a6111de6 not found: ID does not exist" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.584161 4815 scope.go:117] "RemoveContainer" containerID="1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307" Dec 07 19:33:39 crc kubenswrapper[4815]: E1207 19:33:39.584584 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307\": container with ID starting with 1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307 not found: ID does not exist" containerID="1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.584609 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307"} err="failed to get container status \"1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307\": rpc error: code = NotFound desc = could not find container \"1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307\": container with ID starting with 1f679f92806d17f56022e4f685faced3db19387e900f7275e00620734238d307 not found: ID does not exist" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.740100 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.746493 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-wrhqm"] Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.829480 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" 
path="/var/lib/kubelet/pods/c25272ad-ab34-434f-b8be-a945f45ca00b/volumes" Dec 07 19:33:39 crc kubenswrapper[4815]: I1207 19:33:39.830277 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 07 19:33:39 crc kubenswrapper[4815]: W1207 19:33:39.837367 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46adfee5_dc94_49a0_bf62_33d15b10e89c.slice/crio-adb680e9b0fe8df7e57d9e5d9c0e0ab69014540a839cb5b78a5dcb8a8c9c3c6a WatchSource:0}: Error finding container adb680e9b0fe8df7e57d9e5d9c0e0ab69014540a839cb5b78a5dcb8a8c9c3c6a: Status 404 returned error can't find the container with id adb680e9b0fe8df7e57d9e5d9c0e0ab69014540a839cb5b78a5dcb8a8c9c3c6a Dec 07 19:33:40 crc kubenswrapper[4815]: I1207 19:33:40.418224 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"46adfee5-dc94-49a0-bf62-33d15b10e89c","Type":"ContainerStarted","Data":"adb680e9b0fe8df7e57d9e5d9c0e0ab69014540a839cb5b78a5dcb8a8c9c3c6a"} Dec 07 19:33:41 crc kubenswrapper[4815]: I1207 19:33:41.260288 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 07 19:33:41 crc kubenswrapper[4815]: I1207 19:33:41.323107 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 07 19:33:41 crc kubenswrapper[4815]: I1207 19:33:41.416955 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 07 19:33:42 crc kubenswrapper[4815]: I1207 19:33:42.437656 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"46adfee5-dc94-49a0-bf62-33d15b10e89c","Type":"ContainerStarted","Data":"6fd6d67eba118a60034996db34904822437d3ae28ea9f8324ad53127a3de8a52"} Dec 07 19:33:42 crc kubenswrapper[4815]: I1207 19:33:42.438057 4815 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 07 19:33:42 crc kubenswrapper[4815]: I1207 19:33:42.438072 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"46adfee5-dc94-49a0-bf62-33d15b10e89c","Type":"ContainerStarted","Data":"1cf1e5627df7166767e3ff6c60a256b0e05ea75ec4652fefd03165d261d7adc2"} Dec 07 19:33:42 crc kubenswrapper[4815]: I1207 19:33:42.463342 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.829382119 podStartE2EDuration="4.463324283s" podCreationTimestamp="2025-12-07 19:33:38 +0000 UTC" firstStartedPulling="2025-12-07 19:33:39.839504156 +0000 UTC m=+1124.418494201" lastFinishedPulling="2025-12-07 19:33:41.47344632 +0000 UTC m=+1126.052436365" observedRunningTime="2025-12-07 19:33:42.459695329 +0000 UTC m=+1127.038685384" watchObservedRunningTime="2025-12-07 19:33:42.463324283 +0000 UTC m=+1127.042314338" Dec 07 19:33:47 crc kubenswrapper[4815]: I1207 19:33:47.806837 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 07 19:33:47 crc kubenswrapper[4815]: I1207 19:33:47.807547 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 07 19:33:47 crc kubenswrapper[4815]: I1207 19:33:47.955737 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 07 19:33:48 crc kubenswrapper[4815]: I1207 19:33:48.589949 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.198322 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-qqh8l"] Dec 07 19:33:49 crc kubenswrapper[4815]: E1207 19:33:49.198627 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" 
containerName="init" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.198638 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerName="init" Dec 07 19:33:49 crc kubenswrapper[4815]: E1207 19:33:49.198650 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerName="dnsmasq-dns" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.198657 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerName="dnsmasq-dns" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.198807 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c25272ad-ab34-434f-b8be-a945f45ca00b" containerName="dnsmasq-dns" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.199313 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.211958 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qqh8l"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.294755 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7df\" (UniqueName: \"kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.294879 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 
19:33:49.340199 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6091-account-create-update-9gll9"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.341224 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.345981 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6091-account-create-update-9gll9"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.348163 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.395932 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7df\" (UniqueName: \"kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.396014 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.396706 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.417701 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7df\" (UniqueName: 
\"kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df\") pod \"keystone-db-create-qqh8l\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.496994 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkt7h\" (UniqueName: \"kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.497370 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.582882 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.598664 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.598792 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkt7h\" (UniqueName: \"kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.599795 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.639856 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-z7kz6"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.641445 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.644037 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkt7h\" (UniqueName: \"kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h\") pod \"keystone-6091-account-create-update-9gll9\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.654685 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z7kz6"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.654976 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.802041 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4fg\" (UniqueName: \"kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.802603 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.846392 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-49f3-account-create-update-v7qzn"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.851497 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.853785 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.870111 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-49f3-account-create-update-v7qzn"] Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.938292 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4fg\" (UniqueName: \"kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.938383 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.939404 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:49 crc kubenswrapper[4815]: I1207 19:33:49.959629 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4fg\" (UniqueName: \"kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg\") pod \"placement-db-create-z7kz6\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:50 crc 
kubenswrapper[4815]: I1207 19:33:50.040212 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fxl\" (UniqueName: \"kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.040604 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.059017 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.141889 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.142053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fxl\" (UniqueName: \"kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.143214 4815 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.147889 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-qqh8l"] Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.165994 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fxl\" (UniqueName: \"kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl\") pod \"placement-49f3-account-create-update-v7qzn\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: W1207 19:33:50.167253 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41c495d9_ee0f_4f79_b451_6ac61ec6db38.slice/crio-9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686 WatchSource:0}: Error finding container 9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686: Status 404 returned error can't find the container with id 9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686 Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.169560 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.237268 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6091-account-create-update-9gll9"] Dec 07 19:33:50 crc kubenswrapper[4815]: W1207 19:33:50.253989 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29003470_3a1c_4604_bb14_e005d825b5d2.slice/crio-2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c WatchSource:0}: Error finding container 2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c: Status 404 returned error can't find the container with id 2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.373009 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z7kz6"] Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.508168 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-49f3-account-create-update-v7qzn"] Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.518239 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6091-account-create-update-9gll9" event={"ID":"29003470-3a1c-4604-bb14-e005d825b5d2","Type":"ContainerStarted","Data":"32d06d9aeda0cb5ad5ab366cd1a5a3a6c0a29ce0220962dfec177f9970840ae7"} Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.518286 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6091-account-create-update-9gll9" event={"ID":"29003470-3a1c-4604-bb14-e005d825b5d2","Type":"ContainerStarted","Data":"2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c"} Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.523740 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qqh8l" 
event={"ID":"41c495d9-ee0f-4f79-b451-6ac61ec6db38","Type":"ContainerStarted","Data":"6e82e266d076860201e90b246dc6e841f1554832841722c3755e22355a393a53"} Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.523786 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qqh8l" event={"ID":"41c495d9-ee0f-4f79-b451-6ac61ec6db38","Type":"ContainerStarted","Data":"9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686"} Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.531638 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z7kz6" event={"ID":"d318fb6c-147d-42f1-8b19-0abd6b58c83c","Type":"ContainerStarted","Data":"511e7944a3d03b7464baf0bac4cefc1f26f92f73367d0cb303a45b658abb4881"} Dec 07 19:33:50 crc kubenswrapper[4815]: I1207 19:33:50.540904 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6091-account-create-update-9gll9" podStartSLOduration=1.540879924 podStartE2EDuration="1.540879924s" podCreationTimestamp="2025-12-07 19:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:33:50.534144412 +0000 UTC m=+1135.113134457" watchObservedRunningTime="2025-12-07 19:33:50.540879924 +0000 UTC m=+1135.119869969" Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.544243 4815 generic.go:334] "Generic (PLEG): container finished" podID="29003470-3a1c-4604-bb14-e005d825b5d2" containerID="32d06d9aeda0cb5ad5ab366cd1a5a3a6c0a29ce0220962dfec177f9970840ae7" exitCode=0 Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.544930 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6091-account-create-update-9gll9" event={"ID":"29003470-3a1c-4604-bb14-e005d825b5d2","Type":"ContainerDied","Data":"32d06d9aeda0cb5ad5ab366cd1a5a3a6c0a29ce0220962dfec177f9970840ae7"} Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 
19:33:51.547421 4815 generic.go:334] "Generic (PLEG): container finished" podID="41c495d9-ee0f-4f79-b451-6ac61ec6db38" containerID="6e82e266d076860201e90b246dc6e841f1554832841722c3755e22355a393a53" exitCode=0 Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.547508 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qqh8l" event={"ID":"41c495d9-ee0f-4f79-b451-6ac61ec6db38","Type":"ContainerDied","Data":"6e82e266d076860201e90b246dc6e841f1554832841722c3755e22355a393a53"} Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.549629 4815 generic.go:334] "Generic (PLEG): container finished" podID="86c5c96e-14ac-43c0-b753-faf913b71ed9" containerID="987d1d8085f5aa4d7aac4b25f0914e81da0aa0a55a6eeb0609ef98ac15795d09" exitCode=0 Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.549684 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49f3-account-create-update-v7qzn" event={"ID":"86c5c96e-14ac-43c0-b753-faf913b71ed9","Type":"ContainerDied","Data":"987d1d8085f5aa4d7aac4b25f0914e81da0aa0a55a6eeb0609ef98ac15795d09"} Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.549708 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49f3-account-create-update-v7qzn" event={"ID":"86c5c96e-14ac-43c0-b753-faf913b71ed9","Type":"ContainerStarted","Data":"8cb0f4037e926b787faeead5f0c4cca2f97b5147e101f834b818e8feccbd500a"} Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.554329 4815 generic.go:334] "Generic (PLEG): container finished" podID="d318fb6c-147d-42f1-8b19-0abd6b58c83c" containerID="413fe81c61b692e04e1d2a7fb37af579a1e2f298dc1b7bd06aaf96021a73e74e" exitCode=0 Dec 07 19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.554401 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z7kz6" event={"ID":"d318fb6c-147d-42f1-8b19-0abd6b58c83c","Type":"ContainerDied","Data":"413fe81c61b692e04e1d2a7fb37af579a1e2f298dc1b7bd06aaf96021a73e74e"} Dec 07 
19:33:51 crc kubenswrapper[4815]: I1207 19:33:51.564774 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-qqh8l" podStartSLOduration=2.564750026 podStartE2EDuration="2.564750026s" podCreationTimestamp="2025-12-07 19:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:33:50.561087201 +0000 UTC m=+1135.140077246" watchObservedRunningTime="2025-12-07 19:33:51.564750026 +0000 UTC m=+1136.143740071" Dec 07 19:33:52 crc kubenswrapper[4815]: I1207 19:33:52.859345 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.000993 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts\") pod \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.001092 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr4fg\" (UniqueName: \"kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg\") pod \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\" (UID: \"d318fb6c-147d-42f1-8b19-0abd6b58c83c\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.002442 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d318fb6c-147d-42f1-8b19-0abd6b58c83c" (UID: "d318fb6c-147d-42f1-8b19-0abd6b58c83c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.014471 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg" (OuterVolumeSpecName: "kube-api-access-mr4fg") pod "d318fb6c-147d-42f1-8b19-0abd6b58c83c" (UID: "d318fb6c-147d-42f1-8b19-0abd6b58c83c"). InnerVolumeSpecName "kube-api-access-mr4fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.045092 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.051506 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.095036 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.110926 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d318fb6c-147d-42f1-8b19-0abd6b58c83c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.110959 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr4fg\" (UniqueName: \"kubernetes.io/projected/d318fb6c-147d-42f1-8b19-0abd6b58c83c-kube-api-access-mr4fg\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.211640 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts\") pod \"29003470-3a1c-4604-bb14-e005d825b5d2\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.211727 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts\") pod \"86c5c96e-14ac-43c0-b753-faf913b71ed9\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.211861 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkt7h\" (UniqueName: \"kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h\") pod \"29003470-3a1c-4604-bb14-e005d825b5d2\" (UID: \"29003470-3a1c-4604-bb14-e005d825b5d2\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.211905 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts\") pod 
\"41c495d9-ee0f-4f79-b451-6ac61ec6db38\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.211982 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fxl\" (UniqueName: \"kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl\") pod \"86c5c96e-14ac-43c0-b753-faf913b71ed9\" (UID: \"86c5c96e-14ac-43c0-b753-faf913b71ed9\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.212011 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj7df\" (UniqueName: \"kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df\") pod \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\" (UID: \"41c495d9-ee0f-4f79-b451-6ac61ec6db38\") " Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.215190 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df" (OuterVolumeSpecName: "kube-api-access-hj7df") pod "41c495d9-ee0f-4f79-b451-6ac61ec6db38" (UID: "41c495d9-ee0f-4f79-b451-6ac61ec6db38"). InnerVolumeSpecName "kube-api-access-hj7df". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.215506 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41c495d9-ee0f-4f79-b451-6ac61ec6db38" (UID: "41c495d9-ee0f-4f79-b451-6ac61ec6db38"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.215821 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h" (OuterVolumeSpecName: "kube-api-access-nkt7h") pod "29003470-3a1c-4604-bb14-e005d825b5d2" (UID: "29003470-3a1c-4604-bb14-e005d825b5d2"). InnerVolumeSpecName "kube-api-access-nkt7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.216115 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29003470-3a1c-4604-bb14-e005d825b5d2" (UID: "29003470-3a1c-4604-bb14-e005d825b5d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.216338 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86c5c96e-14ac-43c0-b753-faf913b71ed9" (UID: "86c5c96e-14ac-43c0-b753-faf913b71ed9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.217905 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl" (OuterVolumeSpecName: "kube-api-access-79fxl") pod "86c5c96e-14ac-43c0-b753-faf913b71ed9" (UID: "86c5c96e-14ac-43c0-b753-faf913b71ed9"). InnerVolumeSpecName "kube-api-access-79fxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.313966 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fxl\" (UniqueName: \"kubernetes.io/projected/86c5c96e-14ac-43c0-b753-faf913b71ed9-kube-api-access-79fxl\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.314003 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj7df\" (UniqueName: \"kubernetes.io/projected/41c495d9-ee0f-4f79-b451-6ac61ec6db38-kube-api-access-hj7df\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.314015 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29003470-3a1c-4604-bb14-e005d825b5d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.314024 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c5c96e-14ac-43c0-b753-faf913b71ed9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.314032 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkt7h\" (UniqueName: \"kubernetes.io/projected/29003470-3a1c-4604-bb14-e005d825b5d2-kube-api-access-nkt7h\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.314041 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41c495d9-ee0f-4f79-b451-6ac61ec6db38-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.569922 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-49f3-account-create-update-v7qzn" 
event={"ID":"86c5c96e-14ac-43c0-b753-faf913b71ed9","Type":"ContainerDied","Data":"8cb0f4037e926b787faeead5f0c4cca2f97b5147e101f834b818e8feccbd500a"} Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.569958 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb0f4037e926b787faeead5f0c4cca2f97b5147e101f834b818e8feccbd500a" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.570014 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-49f3-account-create-update-v7qzn" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.571638 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z7kz6" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.572061 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z7kz6" event={"ID":"d318fb6c-147d-42f1-8b19-0abd6b58c83c","Type":"ContainerDied","Data":"511e7944a3d03b7464baf0bac4cefc1f26f92f73367d0cb303a45b658abb4881"} Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.572084 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511e7944a3d03b7464baf0bac4cefc1f26f92f73367d0cb303a45b658abb4881" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.573353 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6091-account-create-update-9gll9" event={"ID":"29003470-3a1c-4604-bb14-e005d825b5d2","Type":"ContainerDied","Data":"2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c"} Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.573373 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2451aa9baca923df30646c09f8ebe1eac483c9e1b0472239829b1debd3f2f30c" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.573405 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6091-account-create-update-9gll9" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.575581 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-qqh8l" event={"ID":"41c495d9-ee0f-4f79-b451-6ac61ec6db38","Type":"ContainerDied","Data":"9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686"} Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.575602 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9e9141346bfa90c7c071ebe39e77d4ba8f715a9141c5a4d842d527394ed686" Dec 07 19:33:53 crc kubenswrapper[4815]: I1207 19:33:53.575635 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-qqh8l" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.179475 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.745979 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-dbfd4"] Dec 07 19:33:54 crc kubenswrapper[4815]: E1207 19:33:54.746270 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41c495d9-ee0f-4f79-b451-6ac61ec6db38" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746282 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="41c495d9-ee0f-4f79-b451-6ac61ec6db38" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: E1207 19:33:54.746294 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c5c96e-14ac-43c0-b753-faf913b71ed9" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746301 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c5c96e-14ac-43c0-b753-faf913b71ed9" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: 
E1207 19:33:54.746327 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29003470-3a1c-4604-bb14-e005d825b5d2" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746335 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="29003470-3a1c-4604-bb14-e005d825b5d2" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: E1207 19:33:54.746353 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d318fb6c-147d-42f1-8b19-0abd6b58c83c" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746358 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="d318fb6c-147d-42f1-8b19-0abd6b58c83c" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746487 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c5c96e-14ac-43c0-b753-faf913b71ed9" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746501 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="d318fb6c-147d-42f1-8b19-0abd6b58c83c" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746516 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="41c495d9-ee0f-4f79-b451-6ac61ec6db38" containerName="mariadb-database-create" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.746522 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="29003470-3a1c-4604-bb14-e005d825b5d2" containerName="mariadb-account-create-update" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.747074 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.762252 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dbfd4"] Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.830939 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-8d69-account-create-update-5gjns"] Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.831879 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.839481 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.849175 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8d69-account-create-update-5gjns"] Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.940139 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.940208 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.940232 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltz59\" (UniqueName: 
\"kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:54 crc kubenswrapper[4815]: I1207 19:33:54.940654 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79rss\" (UniqueName: \"kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.042446 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.042507 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltz59\" (UniqueName: \"kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.042706 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79rss\" (UniqueName: \"kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.042749 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.043444 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.044172 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.064535 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltz59\" (UniqueName: \"kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59\") pod \"glance-db-create-dbfd4\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.064554 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79rss\" (UniqueName: \"kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss\") pod \"glance-8d69-account-create-update-5gjns\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.092121 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.151610 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.396352 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-dbfd4"] Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.597027 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbfd4" event={"ID":"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204","Type":"ContainerStarted","Data":"3c5a1416cc04506e01c294be2f260a88a9dc7c21d8f7cea9650a51f522750029"} Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.597303 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbfd4" event={"ID":"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204","Type":"ContainerStarted","Data":"014b615ee4b4daa848162d319d6b3ce816cd176af241781d384b829ba31a39fb"} Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.617010 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-dbfd4" podStartSLOduration=1.616993162 podStartE2EDuration="1.616993162s" podCreationTimestamp="2025-12-07 19:33:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:33:55.610892148 +0000 UTC m=+1140.189882213" watchObservedRunningTime="2025-12-07 19:33:55.616993162 +0000 UTC m=+1140.195983207" Dec 07 19:33:55 crc kubenswrapper[4815]: I1207 19:33:55.682848 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-8d69-account-create-update-5gjns"] Dec 07 19:33:55 crc kubenswrapper[4815]: W1207 19:33:55.692695 4815 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ce65cf3_4329_4b48_8171_f04889224482.slice/crio-d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7 WatchSource:0}: Error finding container d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7: Status 404 returned error can't find the container with id d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7 Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.359873 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.360232 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.608932 4815 generic.go:334] "Generic (PLEG): container finished" podID="8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" containerID="3c5a1416cc04506e01c294be2f260a88a9dc7c21d8f7cea9650a51f522750029" exitCode=0 Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.609331 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbfd4" event={"ID":"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204","Type":"ContainerDied","Data":"3c5a1416cc04506e01c294be2f260a88a9dc7c21d8f7cea9650a51f522750029"} Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.613309 4815 generic.go:334] "Generic (PLEG): container finished" podID="4ce65cf3-4329-4b48-8171-f04889224482" containerID="29aa9f4931f11735f982d8534d433d442982a877dea241f2844724847f9738d5" exitCode=0 Dec 07 19:33:56 crc 
kubenswrapper[4815]: I1207 19:33:56.613340 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8d69-account-create-update-5gjns" event={"ID":"4ce65cf3-4329-4b48-8171-f04889224482","Type":"ContainerDied","Data":"29aa9f4931f11735f982d8534d433d442982a877dea241f2844724847f9738d5"} Dec 07 19:33:56 crc kubenswrapper[4815]: I1207 19:33:56.613359 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8d69-account-create-update-5gjns" event={"ID":"4ce65cf3-4329-4b48-8171-f04889224482","Type":"ContainerStarted","Data":"d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7"} Dec 07 19:33:57 crc kubenswrapper[4815]: I1207 19:33:57.954104 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.052711 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.093851 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts\") pod \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.094194 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltz59\" (UniqueName: \"kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59\") pod \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\" (UID: \"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204\") " Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.094825 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" (UID: "8b7b56ed-7eb9-4bb0-addd-1e234f0f2204"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.099970 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59" (OuterVolumeSpecName: "kube-api-access-ltz59") pod "8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" (UID: "8b7b56ed-7eb9-4bb0-addd-1e234f0f2204"). InnerVolumeSpecName "kube-api-access-ltz59". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.195864 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79rss\" (UniqueName: \"kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss\") pod \"4ce65cf3-4329-4b48-8171-f04889224482\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.196752 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts\") pod \"4ce65cf3-4329-4b48-8171-f04889224482\" (UID: \"4ce65cf3-4329-4b48-8171-f04889224482\") " Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.197257 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltz59\" (UniqueName: \"kubernetes.io/projected/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-kube-api-access-ltz59\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.197321 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 
19:33:58.197776 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ce65cf3-4329-4b48-8171-f04889224482" (UID: "4ce65cf3-4329-4b48-8171-f04889224482"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.200202 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss" (OuterVolumeSpecName: "kube-api-access-79rss") pod "4ce65cf3-4329-4b48-8171-f04889224482" (UID: "4ce65cf3-4329-4b48-8171-f04889224482"). InnerVolumeSpecName "kube-api-access-79rss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.299443 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ce65cf3-4329-4b48-8171-f04889224482-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.299474 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79rss\" (UniqueName: \"kubernetes.io/projected/4ce65cf3-4329-4b48-8171-f04889224482-kube-api-access-79rss\") on node \"crc\" DevicePath \"\"" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.629894 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-dbfd4" event={"ID":"8b7b56ed-7eb9-4bb0-addd-1e234f0f2204","Type":"ContainerDied","Data":"014b615ee4b4daa848162d319d6b3ce816cd176af241781d384b829ba31a39fb"} Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.629935 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-dbfd4" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.629951 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014b615ee4b4daa848162d319d6b3ce816cd176af241781d384b829ba31a39fb" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.631869 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-8d69-account-create-update-5gjns" event={"ID":"4ce65cf3-4329-4b48-8171-f04889224482","Type":"ContainerDied","Data":"d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7"} Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.631893 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c4a197717bc4065d941b817f73fe0f41c2816ac90cab43c3cbcf7d984d80d7" Dec 07 19:33:58 crc kubenswrapper[4815]: I1207 19:33:58.631960 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-8d69-account-create-update-5gjns" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.059623 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s7wqs"] Dec 07 19:34:00 crc kubenswrapper[4815]: E1207 19:34:00.060373 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce65cf3-4329-4b48-8171-f04889224482" containerName="mariadb-account-create-update" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.060390 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce65cf3-4329-4b48-8171-f04889224482" containerName="mariadb-account-create-update" Dec 07 19:34:00 crc kubenswrapper[4815]: E1207 19:34:00.060432 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" containerName="mariadb-database-create" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.060441 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" 
containerName="mariadb-database-create" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.060649 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" containerName="mariadb-database-create" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.060678 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce65cf3-4329-4b48-8171-f04889224482" containerName="mariadb-account-create-update" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.061386 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.065797 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s7wqs"] Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.066588 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2mmrk" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.067213 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.232111 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfq5r\" (UniqueName: \"kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.232432 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 
19:34:00.232569 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.232675 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.334041 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfq5r\" (UniqueName: \"kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.334112 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.334179 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.334220 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.339831 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.340899 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.351146 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.359565 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfq5r\" (UniqueName: \"kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r\") pod \"glance-db-sync-s7wqs\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.497950 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.651408 4815 generic.go:334] "Generic (PLEG): container finished" podID="814a06c9-c432-4a32-835e-59a4831cf335" containerID="27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65" exitCode=0 Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.651691 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerDied","Data":"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65"} Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.688960 4815 generic.go:334] "Generic (PLEG): container finished" podID="c2f348fe-7af1-4260-9946-27b3e711400d" containerID="35fe694f36a5b968fea28439d6a0189c15b3d94d39753db9a98857fefc277512" exitCode=0 Dec 07 19:34:00 crc kubenswrapper[4815]: I1207 19:34:00.689110 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerDied","Data":"35fe694f36a5b968fea28439d6a0189c15b3d94d39753db9a98857fefc277512"} Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.129389 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s7wqs"] Dec 07 19:34:01 crc kubenswrapper[4815]: W1207 19:34:01.133040 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e07853f_f24e_4bf6_8af1_15e4e9cccbc4.slice/crio-0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87 WatchSource:0}: Error finding container 0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87: Status 404 returned error can't find the container with id 0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87 Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.701910 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerStarted","Data":"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516"} Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.703515 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.710354 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerStarted","Data":"c6a0f7926e8736e115fd39c81271cae94bd2d46de082517f365745f1b0ebf535"} Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.711872 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.714560 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s7wqs" event={"ID":"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4","Type":"ContainerStarted","Data":"0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87"} Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.753580 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.386779458 podStartE2EDuration="1m6.753562706s" podCreationTimestamp="2025-12-07 19:32:55 +0000 UTC" firstStartedPulling="2025-12-07 19:33:14.10545721 +0000 UTC m=+1098.684447255" lastFinishedPulling="2025-12-07 19:33:25.472240458 +0000 UTC m=+1110.051230503" observedRunningTime="2025-12-07 19:34:01.740512424 +0000 UTC m=+1146.319502459" watchObservedRunningTime="2025-12-07 19:34:01.753562706 +0000 UTC m=+1146.332552751" Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.804103 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.430222105 
podStartE2EDuration="1m7.804082148s" podCreationTimestamp="2025-12-07 19:32:54 +0000 UTC" firstStartedPulling="2025-12-07 19:33:13.340890338 +0000 UTC m=+1097.919880383" lastFinishedPulling="2025-12-07 19:33:26.714750381 +0000 UTC m=+1111.293740426" observedRunningTime="2025-12-07 19:34:01.799870128 +0000 UTC m=+1146.378860183" watchObservedRunningTime="2025-12-07 19:34:01.804082148 +0000 UTC m=+1146.383072193" Dec 07 19:34:01 crc kubenswrapper[4815]: I1207 19:34:01.889240 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-24gkx" podUID="5d3ed0f7-1ea2-48e7-bab4-f5a709da4850" containerName="ovn-controller" probeResult="failure" output=< Dec 07 19:34:01 crc kubenswrapper[4815]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 07 19:34:01 crc kubenswrapper[4815]: > Dec 07 19:34:06 crc kubenswrapper[4815]: I1207 19:34:06.882587 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-24gkx" podUID="5d3ed0f7-1ea2-48e7-bab4-f5a709da4850" containerName="ovn-controller" probeResult="failure" output=< Dec 07 19:34:06 crc kubenswrapper[4815]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 07 19:34:06 crc kubenswrapper[4815]: > Dec 07 19:34:06 crc kubenswrapper[4815]: I1207 19:34:06.902592 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:34:06 crc kubenswrapper[4815]: I1207 19:34:06.904475 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bhvc5" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.148334 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-24gkx-config-jw62j"] Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.160533 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.177903 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-24gkx-config-jw62j"] Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.181769 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263228 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263311 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kg\" (UniqueName: \"kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263332 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263346 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: 
\"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263383 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.263431 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384136 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384212 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384256 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: 
\"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384296 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss9kg\" (UniqueName: \"kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384318 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384332 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384743 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384805 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: 
\"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.384841 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.385107 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.386651 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.425568 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kg\" (UniqueName: \"kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg\") pod \"ovn-controller-24gkx-config-jw62j\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:07 crc kubenswrapper[4815]: I1207 19:34:07.483379 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:11 crc kubenswrapper[4815]: I1207 19:34:11.965211 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-24gkx" podUID="5d3ed0f7-1ea2-48e7-bab4-f5a709da4850" containerName="ovn-controller" probeResult="failure" output=< Dec 07 19:34:11 crc kubenswrapper[4815]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 07 19:34:11 crc kubenswrapper[4815]: > Dec 07 19:34:16 crc kubenswrapper[4815]: I1207 19:34:16.457206 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 07 19:34:16 crc kubenswrapper[4815]: I1207 19:34:16.650090 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:34:16 crc kubenswrapper[4815]: I1207 19:34:16.948660 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tjblv"] Dec 07 19:34:16 crc kubenswrapper[4815]: I1207 19:34:16.949949 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:16 crc kubenswrapper[4815]: I1207 19:34:16.988285 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tjblv"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.004835 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-24gkx" podUID="5d3ed0f7-1ea2-48e7-bab4-f5a709da4850" containerName="ovn-controller" probeResult="failure" output=< Dec 07 19:34:17 crc kubenswrapper[4815]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 07 19:34:17 crc kubenswrapper[4815]: > Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.048368 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts\") pod \"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.048527 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nvx\" (UniqueName: \"kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx\") pod \"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: E1207 19:34:17.074190 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 07 19:34:17 crc kubenswrapper[4815]: E1207 19:34:17.074404 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfq5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod glance-db-sync-s7wqs_openstack(9e07853f-f24e-4bf6-8af1-15e4e9cccbc4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:34:17 crc kubenswrapper[4815]: E1207 19:34:17.086053 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-s7wqs" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.123964 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-t9b9t"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.124938 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.142954 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f0f9-account-create-update-djwr2"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.152438 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.153658 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nvx\" (UniqueName: \"kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx\") pod \"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.153792 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts\") pod \"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.155810 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.158057 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts\") pod \"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.159897 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t9b9t"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.185644 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f0f9-account-create-update-djwr2"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.226015 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nvx\" (UniqueName: \"kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx\") pod 
\"cinder-db-create-tjblv\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.254826 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2crw\" (UniqueName: \"kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.254971 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.255026 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.255046 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2kps\" (UniqueName: \"kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.276007 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.317337 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-pvvmf"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.318449 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.336699 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.336961 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mdlc7" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.337077 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.347306 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.357795 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.357939 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.357990 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-d2kps\" (UniqueName: \"kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.358018 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2crw\" (UniqueName: \"kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.359316 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.360013 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.414538 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2crw\" (UniqueName: \"kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw\") pod \"barbican-db-create-t9b9t\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.443523 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-pvvmf"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.451745 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-22e4-account-create-update-5hjjx"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.457293 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.459954 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.460000 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.460112 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqk6\" (UniqueName: \"kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.470408 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.484011 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.505469 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2kps\" (UniqueName: \"kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps\") pod \"barbican-f0f9-account-create-update-djwr2\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.561070 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.561174 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br4xg\" (UniqueName: \"kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.561242 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.561320 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6lqk6\" (UniqueName: \"kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.561400 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.570639 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.570897 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-22e4-account-create-update-5hjjx"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.579615 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.584324 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqk6\" (UniqueName: \"kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6\") pod \"keystone-db-sync-pvvmf\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc 
kubenswrapper[4815]: I1207 19:34:17.637984 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wgnj5"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.639581 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.661499 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wgnj5"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.670179 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br4xg\" (UniqueName: \"kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.670245 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.671100 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.722741 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.787788 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7892\" (UniqueName: \"kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.787900 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.789831 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.795060 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br4xg\" (UniqueName: \"kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg\") pod \"cinder-22e4-account-create-update-5hjjx\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.876978 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7561-account-create-update-v2s5j"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.878087 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.891571 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.892338 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.892459 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7892\" (UniqueName: \"kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.893291 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.906868 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7561-account-create-update-v2s5j"] Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.953592 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7892\" (UniqueName: \"kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892\") pod \"neutron-db-create-wgnj5\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:17 crc kubenswrapper[4815]: 
I1207 19:34:17.994612 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:17 crc kubenswrapper[4815]: I1207 19:34:17.994690 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttkmv\" (UniqueName: \"kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.022381 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:18 crc kubenswrapper[4815]: E1207 19:34:18.059834 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-s7wqs" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.078244 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.139801 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-24gkx-config-jw62j"] Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.141603 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttkmv\" (UniqueName: \"kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.142532 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.145437 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.204468 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttkmv\" (UniqueName: \"kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv\") pod \"neutron-7561-account-create-update-v2s5j\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.252340 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.524131 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tjblv"] Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.838708 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-22e4-account-create-update-5hjjx"] Dec 07 19:34:18 crc kubenswrapper[4815]: W1207 19:34:18.850447 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7ceffd5_e9d5_42d6_a93f_cbbfa94c0492.slice/crio-27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23 WatchSource:0}: Error finding container 27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23: Status 404 returned error can't find the container with id 27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23 Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.882362 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-pvvmf"] Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.946354 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wgnj5"] Dec 07 19:34:18 crc kubenswrapper[4815]: I1207 19:34:18.983749 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-t9b9t"] Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.071484 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-22e4-account-create-update-5hjjx" event={"ID":"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492","Type":"ContainerStarted","Data":"27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.072943 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pvvmf" 
event={"ID":"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f","Type":"ContainerStarted","Data":"b4b52acb7750d9e6abeef2e7c3f48dee0382a357998b485fc0cf626cf1fff80f"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.073784 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t9b9t" event={"ID":"2a63d827-767e-4965-871c-be277190b680","Type":"ContainerStarted","Data":"1e21b19b763a10aa315f5368d5a3a542574f3fadb5929525760264646e2f1985"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.084682 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx-config-jw62j" event={"ID":"2997b217-ae9b-4d50-b2cc-b39825b574f6","Type":"ContainerStarted","Data":"37ffa573132baeeedd24de71eaf28cc651f659e5ea501f45e91697b3ce504cab"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.084721 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx-config-jw62j" event={"ID":"2997b217-ae9b-4d50-b2cc-b39825b574f6","Type":"ContainerStarted","Data":"aa785d22cf947374a6278c1d55666132cb4aa032fbb018d921ae6165539e6d7b"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.091690 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f0f9-account-create-update-djwr2"] Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.095648 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tjblv" event={"ID":"08539e2c-7b61-45a4-8dad-7763d4cc8d01","Type":"ContainerStarted","Data":"4e80c0b15820ac03d4cbf050bb5a08544ee945014233e8026425a70df3759782"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.095718 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tjblv" event={"ID":"08539e2c-7b61-45a4-8dad-7763d4cc8d01","Type":"ContainerStarted","Data":"e3cdefbc1f8a2d0f39731d1184c2acda959a4eea8abddf0d2de6ebba6cf7d3e0"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.097838 4815 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-wgnj5" event={"ID":"c4ee2adb-b254-42cc-8100-09c598d670ef","Type":"ContainerStarted","Data":"0567486e813d56d75b57af861318751db998678a71ddcbe7e7a288c6bf4ec970"} Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.103722 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-24gkx-config-jw62j" podStartSLOduration=12.103703647 podStartE2EDuration="12.103703647s" podCreationTimestamp="2025-12-07 19:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:34:19.101701449 +0000 UTC m=+1163.680691494" watchObservedRunningTime="2025-12-07 19:34:19.103703647 +0000 UTC m=+1163.682693692" Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.128316 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-tjblv" podStartSLOduration=3.128294788 podStartE2EDuration="3.128294788s" podCreationTimestamp="2025-12-07 19:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:34:19.124198011 +0000 UTC m=+1163.703188056" watchObservedRunningTime="2025-12-07 19:34:19.128294788 +0000 UTC m=+1163.707284833" Dec 07 19:34:19 crc kubenswrapper[4815]: I1207 19:34:19.150274 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7561-account-create-update-v2s5j"] Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.115102 4815 generic.go:334] "Generic (PLEG): container finished" podID="2997b217-ae9b-4d50-b2cc-b39825b574f6" containerID="37ffa573132baeeedd24de71eaf28cc651f659e5ea501f45e91697b3ce504cab" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.115200 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx-config-jw62j" 
event={"ID":"2997b217-ae9b-4d50-b2cc-b39825b574f6","Type":"ContainerDied","Data":"37ffa573132baeeedd24de71eaf28cc651f659e5ea501f45e91697b3ce504cab"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.117542 4815 generic.go:334] "Generic (PLEG): container finished" podID="b2b4585c-b74e-47d2-9b7e-6a61791104be" containerID="e6e3a0aa7582e8a28024c2949c8649e6dfe241f89d48688460aada97a926a10b" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.117598 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7561-account-create-update-v2s5j" event={"ID":"b2b4585c-b74e-47d2-9b7e-6a61791104be","Type":"ContainerDied","Data":"e6e3a0aa7582e8a28024c2949c8649e6dfe241f89d48688460aada97a926a10b"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.117621 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7561-account-create-update-v2s5j" event={"ID":"b2b4585c-b74e-47d2-9b7e-6a61791104be","Type":"ContainerStarted","Data":"4c8a64a2e73f13e3ec679c5d8f24d23565506b451e338d1ac8f15ec472aaba9f"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.119402 4815 generic.go:334] "Generic (PLEG): container finished" podID="08539e2c-7b61-45a4-8dad-7763d4cc8d01" containerID="4e80c0b15820ac03d4cbf050bb5a08544ee945014233e8026425a70df3759782" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.119480 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tjblv" event={"ID":"08539e2c-7b61-45a4-8dad-7763d4cc8d01","Type":"ContainerDied","Data":"4e80c0b15820ac03d4cbf050bb5a08544ee945014233e8026425a70df3759782"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.121784 4815 generic.go:334] "Generic (PLEG): container finished" podID="c4ee2adb-b254-42cc-8100-09c598d670ef" containerID="9101cea0698219e2c998588a6935d8d020c7e39bdcf2b48591620eabbab7b833" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.121842 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-wgnj5" event={"ID":"c4ee2adb-b254-42cc-8100-09c598d670ef","Type":"ContainerDied","Data":"9101cea0698219e2c998588a6935d8d020c7e39bdcf2b48591620eabbab7b833"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.130328 4815 generic.go:334] "Generic (PLEG): container finished" podID="c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" containerID="9733e73186249f0acf8025983943ef1294053b7f3b72e4d643632ffa9e1ab6f6" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.130425 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-22e4-account-create-update-5hjjx" event={"ID":"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492","Type":"ContainerDied","Data":"9733e73186249f0acf8025983943ef1294053b7f3b72e4d643632ffa9e1ab6f6"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.131986 4815 generic.go:334] "Generic (PLEG): container finished" podID="72214816-2ac0-4ba7-99f8-6d56479c7e40" containerID="889d8bcfb06ac09b2961d353f1f74b92505bb6027988185fec9ae18edfec1164" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.132025 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0f9-account-create-update-djwr2" event={"ID":"72214816-2ac0-4ba7-99f8-6d56479c7e40","Type":"ContainerDied","Data":"889d8bcfb06ac09b2961d353f1f74b92505bb6027988185fec9ae18edfec1164"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.132085 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0f9-account-create-update-djwr2" event={"ID":"72214816-2ac0-4ba7-99f8-6d56479c7e40","Type":"ContainerStarted","Data":"5044fab9e4017b0e163a0057661ef1a7c6af8e956c69385ad4b1139b111092b9"} Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.136248 4815 generic.go:334] "Generic (PLEG): container finished" podID="2a63d827-767e-4965-871c-be277190b680" containerID="37d0eee11812d81cb6af3ee722d559ea0115882ed3d4303ab77b47e111d50e87" exitCode=0 Dec 07 19:34:20 crc kubenswrapper[4815]: I1207 19:34:20.136287 
4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t9b9t" event={"ID":"2a63d827-767e-4965-871c-be277190b680","Type":"ContainerDied","Data":"37d0eee11812d81cb6af3ee722d559ea0115882ed3d4303ab77b47e111d50e87"} Dec 07 19:34:21 crc kubenswrapper[4815]: I1207 19:34:21.924615 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-24gkx" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.177356 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-24gkx-config-jw62j" event={"ID":"2997b217-ae9b-4d50-b2cc-b39825b574f6","Type":"ContainerDied","Data":"aa785d22cf947374a6278c1d55666132cb4aa032fbb018d921ae6165539e6d7b"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.178569 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa785d22cf947374a6278c1d55666132cb4aa032fbb018d921ae6165539e6d7b" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.179003 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7561-account-create-update-v2s5j" event={"ID":"b2b4585c-b74e-47d2-9b7e-6a61791104be","Type":"ContainerDied","Data":"4c8a64a2e73f13e3ec679c5d8f24d23565506b451e338d1ac8f15ec472aaba9f"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.179060 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8a64a2e73f13e3ec679c5d8f24d23565506b451e338d1ac8f15ec472aaba9f" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.185321 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tjblv" event={"ID":"08539e2c-7b61-45a4-8dad-7763d4cc8d01","Type":"ContainerDied","Data":"e3cdefbc1f8a2d0f39731d1184c2acda959a4eea8abddf0d2de6ebba6cf7d3e0"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.185353 4815 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e3cdefbc1f8a2d0f39731d1184c2acda959a4eea8abddf0d2de6ebba6cf7d3e0" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.189323 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wgnj5" event={"ID":"c4ee2adb-b254-42cc-8100-09c598d670ef","Type":"ContainerDied","Data":"0567486e813d56d75b57af861318751db998678a71ddcbe7e7a288c6bf4ec970"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.189383 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0567486e813d56d75b57af861318751db998678a71ddcbe7e7a288c6bf4ec970" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.192243 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-22e4-account-create-update-5hjjx" event={"ID":"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492","Type":"ContainerDied","Data":"27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.192265 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f8ef0035724d4aba3d0d19d7af088af6aa310593822a152d7a38cd96f5db23" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.196775 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f0f9-account-create-update-djwr2" event={"ID":"72214816-2ac0-4ba7-99f8-6d56479c7e40","Type":"ContainerDied","Data":"5044fab9e4017b0e163a0057661ef1a7c6af8e956c69385ad4b1139b111092b9"} Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.196800 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5044fab9e4017b0e163a0057661ef1a7c6af8e956c69385ad4b1139b111092b9" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.198206 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-t9b9t" event={"ID":"2a63d827-767e-4965-871c-be277190b680","Type":"ContainerDied","Data":"1e21b19b763a10aa315f5368d5a3a542574f3fadb5929525760264646e2f1985"} Dec 
07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.198223 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e21b19b763a10aa315f5368d5a3a542574f3fadb5929525760264646e2f1985" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.282063 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.294200 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.321707 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.342674 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.353807 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.358225 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.361136 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.404642 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts\") pod \"b2b4585c-b74e-47d2-9b7e-6a61791104be\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.405363 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttkmv\" (UniqueName: \"kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv\") pod \"b2b4585c-b74e-47d2-9b7e-6a61791104be\" (UID: \"b2b4585c-b74e-47d2-9b7e-6a61791104be\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.405407 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2crw\" (UniqueName: \"kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw\") pod \"2a63d827-767e-4965-871c-be277190b680\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.405718 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.406163 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts\") pod \"2a63d827-767e-4965-871c-be277190b680\" (UID: \"2a63d827-767e-4965-871c-be277190b680\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.409274 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts" (OuterVolumeSpecName: "scripts") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.410926 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a63d827-767e-4965-871c-be277190b680" (UID: "2a63d827-767e-4965-871c-be277190b680"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.412521 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2b4585c-b74e-47d2-9b7e-6a61791104be" (UID: "b2b4585c-b74e-47d2-9b7e-6a61791104be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.415431 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv" (OuterVolumeSpecName: "kube-api-access-ttkmv") pod "b2b4585c-b74e-47d2-9b7e-6a61791104be" (UID: "b2b4585c-b74e-47d2-9b7e-6a61791104be"). InnerVolumeSpecName "kube-api-access-ttkmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.415501 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw" (OuterVolumeSpecName: "kube-api-access-l2crw") pod "2a63d827-767e-4965-871c-be277190b680" (UID: "2a63d827-767e-4965-871c-be277190b680"). InnerVolumeSpecName "kube-api-access-l2crw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.510063 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.510311 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7892\" (UniqueName: \"kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892\") pod \"c4ee2adb-b254-42cc-8100-09c598d670ef\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.510404 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2kps\" (UniqueName: \"kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps\") pod \"72214816-2ac0-4ba7-99f8-6d56479c7e40\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.510477 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511437 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9kg\" (UniqueName: \"kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511565 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nvx\" (UniqueName: \"kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx\") pod \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511660 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511752 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511834 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts\") pod \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\" (UID: \"08539e2c-7b61-45a4-8dad-7763d4cc8d01\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.511928 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts\") pod \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.512038 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts\") pod \"72214816-2ac0-4ba7-99f8-6d56479c7e40\" (UID: \"72214816-2ac0-4ba7-99f8-6d56479c7e40\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.512122 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts\") pod \"c4ee2adb-b254-42cc-8100-09c598d670ef\" (UID: \"c4ee2adb-b254-42cc-8100-09c598d670ef\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.512203 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br4xg\" (UniqueName: \"kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg\") pod \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\" (UID: \"c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.512274 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn\") pod \"2997b217-ae9b-4d50-b2cc-b39825b574f6\" (UID: \"2997b217-ae9b-4d50-b2cc-b39825b574f6\") " Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513292 4815 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc 
kubenswrapper[4815]: I1207 19:34:25.513383 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2997b217-ae9b-4d50-b2cc-b39825b574f6-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513443 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a63d827-767e-4965-871c-be277190b680-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513521 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2b4585c-b74e-47d2-9b7e-6a61791104be-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513677 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttkmv\" (UniqueName: \"kubernetes.io/projected/b2b4585c-b74e-47d2-9b7e-6a61791104be-kube-api-access-ttkmv\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513750 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2crw\" (UniqueName: \"kubernetes.io/projected/2a63d827-767e-4965-871c-be277190b680-kube-api-access-l2crw\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.513863 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.514996 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "08539e2c-7b61-45a4-8dad-7763d4cc8d01" (UID: "08539e2c-7b61-45a4-8dad-7763d4cc8d01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.515205 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run" (OuterVolumeSpecName: "var-run") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.516036 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.516815 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72214816-2ac0-4ba7-99f8-6d56479c7e40" (UID: "72214816-2ac0-4ba7-99f8-6d56479c7e40"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.517291 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" (UID: "c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.517876 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4ee2adb-b254-42cc-8100-09c598d670ef" (UID: "c4ee2adb-b254-42cc-8100-09c598d670ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.517877 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps" (OuterVolumeSpecName: "kube-api-access-d2kps") pod "72214816-2ac0-4ba7-99f8-6d56479c7e40" (UID: "72214816-2ac0-4ba7-99f8-6d56479c7e40"). InnerVolumeSpecName "kube-api-access-d2kps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.518028 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx" (OuterVolumeSpecName: "kube-api-access-w9nvx") pod "08539e2c-7b61-45a4-8dad-7763d4cc8d01" (UID: "08539e2c-7b61-45a4-8dad-7763d4cc8d01"). InnerVolumeSpecName "kube-api-access-w9nvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.519604 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892" (OuterVolumeSpecName: "kube-api-access-g7892") pod "c4ee2adb-b254-42cc-8100-09c598d670ef" (UID: "c4ee2adb-b254-42cc-8100-09c598d670ef"). InnerVolumeSpecName "kube-api-access-g7892". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.522213 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg" (OuterVolumeSpecName: "kube-api-access-br4xg") pod "c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" (UID: "c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492"). InnerVolumeSpecName "kube-api-access-br4xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.522297 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg" (OuterVolumeSpecName: "kube-api-access-ss9kg") pod "2997b217-ae9b-4d50-b2cc-b39825b574f6" (UID: "2997b217-ae9b-4d50-b2cc-b39825b574f6"). InnerVolumeSpecName "kube-api-access-ss9kg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615684 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72214816-2ac0-4ba7-99f8-6d56479c7e40-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615715 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4ee2adb-b254-42cc-8100-09c598d670ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615724 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br4xg\" (UniqueName: \"kubernetes.io/projected/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-kube-api-access-br4xg\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615733 4815 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615741 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7892\" (UniqueName: \"kubernetes.io/projected/c4ee2adb-b254-42cc-8100-09c598d670ef-kube-api-access-g7892\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615750 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2kps\" (UniqueName: \"kubernetes.io/projected/72214816-2ac0-4ba7-99f8-6d56479c7e40-kube-api-access-d2kps\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615758 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9kg\" (UniqueName: \"kubernetes.io/projected/2997b217-ae9b-4d50-b2cc-b39825b574f6-kube-api-access-ss9kg\") on node \"crc\" DevicePath \"\"" Dec 07 
19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615766 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nvx\" (UniqueName: \"kubernetes.io/projected/08539e2c-7b61-45a4-8dad-7763d4cc8d01-kube-api-access-w9nvx\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615774 4815 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615782 4815 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2997b217-ae9b-4d50-b2cc-b39825b574f6-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615792 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08539e2c-7b61-45a4-8dad-7763d4cc8d01-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:25 crc kubenswrapper[4815]: I1207 19:34:25.615800 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.206081 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-24gkx-config-jw62j" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.206343 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pvvmf" event={"ID":"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f","Type":"ContainerStarted","Data":"7700c80d760dfd0819f2f9de3d138f2dafe643324fdbc5b091bdd0eda3b36356"} Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.206426 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-f0f9-account-create-update-djwr2" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.206461 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7561-account-create-update-v2s5j" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.207500 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tjblv" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.207900 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-t9b9t" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.208000 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wgnj5" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.208130 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-22e4-account-create-update-5hjjx" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.248488 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-pvvmf" podStartSLOduration=2.996695119 podStartE2EDuration="9.248463687s" podCreationTimestamp="2025-12-07 19:34:17 +0000 UTC" firstStartedPulling="2025-12-07 19:34:18.901169937 +0000 UTC m=+1163.480159982" lastFinishedPulling="2025-12-07 19:34:25.152938505 +0000 UTC m=+1169.731928550" observedRunningTime="2025-12-07 19:34:26.231989773 +0000 UTC m=+1170.810979818" watchObservedRunningTime="2025-12-07 19:34:26.248463687 +0000 UTC m=+1170.827453732" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.360219 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.360562 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.545882 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-24gkx-config-jw62j"] Dec 07 19:34:26 crc kubenswrapper[4815]: I1207 19:34:26.552601 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-24gkx-config-jw62j"] Dec 07 19:34:27 crc kubenswrapper[4815]: I1207 19:34:27.778983 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2997b217-ae9b-4d50-b2cc-b39825b574f6" path="/var/lib/kubelet/pods/2997b217-ae9b-4d50-b2cc-b39825b574f6/volumes" Dec 07 19:34:29 crc kubenswrapper[4815]: I1207 19:34:29.227292 4815 generic.go:334] "Generic (PLEG): container finished" podID="4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" containerID="7700c80d760dfd0819f2f9de3d138f2dafe643324fdbc5b091bdd0eda3b36356" exitCode=0 Dec 07 19:34:29 crc kubenswrapper[4815]: I1207 19:34:29.227332 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pvvmf" event={"ID":"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f","Type":"ContainerDied","Data":"7700c80d760dfd0819f2f9de3d138f2dafe643324fdbc5b091bdd0eda3b36356"} Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.572722 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.596687 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle\") pod \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.596723 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data\") pod \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.596785 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lqk6\" (UniqueName: \"kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6\") pod \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\" (UID: \"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f\") " Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.629096 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6" (OuterVolumeSpecName: "kube-api-access-6lqk6") pod "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" (UID: "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f"). InnerVolumeSpecName "kube-api-access-6lqk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.654272 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" (UID: "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.684188 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data" (OuterVolumeSpecName: "config-data") pod "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" (UID: "4bb0c957-b8cf-48ef-8a8d-f409fa031e2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.697964 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.698185 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:30 crc kubenswrapper[4815]: I1207 19:34:30.698246 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lqk6\" (UniqueName: \"kubernetes.io/projected/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f-kube-api-access-6lqk6\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.257318 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pvvmf" event={"ID":"4bb0c957-b8cf-48ef-8a8d-f409fa031e2f","Type":"ContainerDied","Data":"b4b52acb7750d9e6abeef2e7c3f48dee0382a357998b485fc0cf626cf1fff80f"} Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.257378 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b52acb7750d9e6abeef2e7c3f48dee0382a357998b485fc0cf626cf1fff80f" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.257460 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pvvmf" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.611449 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.612522 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b4585c-b74e-47d2-9b7e-6a61791104be" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.612594 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b4585c-b74e-47d2-9b7e-6a61791104be" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.612660 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2997b217-ae9b-4d50-b2cc-b39825b574f6" containerName="ovn-config" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.612710 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2997b217-ae9b-4d50-b2cc-b39825b574f6" containerName="ovn-config" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.612766 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" containerName="keystone-db-sync" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.612822 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" containerName="keystone-db-sync" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.612883 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08539e2c-7b61-45a4-8dad-7763d4cc8d01" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.612948 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="08539e2c-7b61-45a4-8dad-7763d4cc8d01" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.613011 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a63d827-767e-4965-871c-be277190b680" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613075 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a63d827-767e-4965-871c-be277190b680" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.613144 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ee2adb-b254-42cc-8100-09c598d670ef" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613198 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ee2adb-b254-42cc-8100-09c598d670ef" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.613271 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72214816-2ac0-4ba7-99f8-6d56479c7e40" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613327 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="72214816-2ac0-4ba7-99f8-6d56479c7e40" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: E1207 19:34:31.613396 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613449 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613674 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a63d827-767e-4965-871c-be277190b680" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613755 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ee2adb-b254-42cc-8100-09c598d670ef" containerName="mariadb-database-create" Dec 07 19:34:31 crc 
kubenswrapper[4815]: I1207 19:34:31.613832 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" containerName="keystone-db-sync" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613902 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="72214816-2ac0-4ba7-99f8-6d56479c7e40" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.613992 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.614056 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="08539e2c-7b61-45a4-8dad-7763d4cc8d01" containerName="mariadb-database-create" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.614119 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b4585c-b74e-47d2-9b7e-6a61791104be" containerName="mariadb-account-create-update" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.614187 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2997b217-ae9b-4d50-b2cc-b39825b574f6" containerName="ovn-config" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.615179 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.620799 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv725\" (UniqueName: \"kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.621042 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.621134 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.621221 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.621321 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" 
(UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.638773 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.655520 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qtv6m"] Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.657137 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.659929 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mdlc7" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.660108 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.666847 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.666847 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.667139 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.743819 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.743986 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.744435 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv725\" (UniqueName: \"kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.744500 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.744538 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.744731 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.744732 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc\") pod 
\"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.745294 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.745607 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.810648 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv725\" (UniqueName: \"kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725\") pod \"dnsmasq-dns-66fbd85b65-jz6kd\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.837358 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qtv6m"] Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848259 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgcsv\" (UniqueName: \"kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848638 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848655 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848731 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848766 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.848845 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.951267 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.952055 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.952115 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgcsv\" (UniqueName: \"kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.952136 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.952151 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.952199 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc 
kubenswrapper[4815]: I1207 19:34:31.952214 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.955222 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.957636 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.962536 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.962860 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.964864 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.964895 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.965333 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts\") pod 
\"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.976508 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.989677 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:31 crc kubenswrapper[4815]: I1207 19:34:31.999726 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgcsv\" (UniqueName: \"kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv\") pod \"keystone-bootstrap-qtv6m\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.012737 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055118 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q89mk\" (UniqueName: \"kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055161 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055194 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055217 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055270 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055297 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.055340 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " 
pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.125532 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ljqjq"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.126628 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.133400 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.133699 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5wjk" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.147125 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156328 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q89mk\" (UniqueName: \"kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156358 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156399 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156419 4815 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156480 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156504 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.156540 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.169037 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.169856 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jfdqh"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.171245 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.172779 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.189147 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-p5hgv"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.190384 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.197432 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.197653 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lv59v" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.197821 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zlbgr" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.207806 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.208041 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.214346 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.233332 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q89mk\" (UniqueName: 
\"kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.236239 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.248590 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.253019 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.253932 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data\") pod \"ceilometer-0\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.258959 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 
19:34:32.259040 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.259078 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdzfx\" (UniqueName: \"kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.259105 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.259157 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.259214 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.289481 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.299200 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ljqjq"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.357222 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jfdqh"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365693 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365742 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365765 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365792 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365827 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365869 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crgd5\" (UniqueName: \"kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365899 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365937 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.365978 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.382088 4815 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.382134 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdzfx\" (UniqueName: \"kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.382172 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.382204 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.382244 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77cc\" (UniqueName: \"kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.405974 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.426175 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.426279 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.426611 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5hgv"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.428903 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.445627 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.456400 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5s8vb"] Dec 07 19:34:32 crc kubenswrapper[4815]: 
I1207 19:34:32.458424 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.462629 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.494567 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563356 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563423 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77cc\" (UniqueName: \"kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563490 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563529 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc 
kubenswrapper[4815]: I1207 19:34:32.563588 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563661 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crgd5\" (UniqueName: \"kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563711 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.563729 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.574392 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.577407 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.582777 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdzfx\" (UniqueName: \"kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx\") pod \"cinder-db-sync-ljqjq\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.583093 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.587380 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8jlxs" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.587832 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.588334 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.593990 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 
19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.607543 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crgd5\" (UniqueName: \"kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5\") pod \"neutron-db-sync-p5hgv\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.608124 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.613377 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77cc\" (UniqueName: \"kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc\") pod \"placement-db-sync-jfdqh\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") " pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.664749 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.664844 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 
19:34:32.664875 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxvc\" (UniqueName: \"kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.666485 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5s8vb"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.668411 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.732966 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.734435 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.744340 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.789443 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.789495 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxvc\" (UniqueName: \"kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 
crc kubenswrapper[4815]: I1207 19:34:32.789549 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.802864 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.834107 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.835030 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.853255 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxvc\" (UniqueName: \"kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc\") pod \"barbican-db-sync-5s8vb\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.893054 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wwwc\" (UniqueName: \"kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc\") pod 
\"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.893155 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.893188 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.893248 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.893268 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.901689 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jfdqh" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.953104 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.968854 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.994411 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.994469 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.994547 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wwwc\" (UniqueName: \"kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.994592 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc 
kubenswrapper[4815]: I1207 19:34:32.994629 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.995903 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.996685 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.997465 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:32 crc kubenswrapper[4815]: I1207 19:34:32.999376 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.035371 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9wwwc\" (UniqueName: \"kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc\") pod \"dnsmasq-dns-6bf59f66bf-dftst\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.216382 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.392848 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" event={"ID":"90d2c008-b1eb-4be6-b9aa-932ac14e287b","Type":"ContainerStarted","Data":"b74b9cf197af134eedc463e696aa7de5de24deaa5f9699dc37a8cb9edd690495"} Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.633343 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.845349 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ljqjq"] Dec 07 19:34:33 crc kubenswrapper[4815]: W1207 19:34:33.855394 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb75e0b_4e7b_484f_832b_5ed69650f1f1.slice/crio-4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b WatchSource:0}: Error finding container 4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b: Status 404 returned error can't find the container with id 4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.857493 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qtv6m"] Dec 07 19:34:33 crc kubenswrapper[4815]: I1207 19:34:33.992492 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-p5hgv"] Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.053094 4815 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5s8vb"] Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.376173 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jfdqh"] Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.405573 4815 generic.go:334] "Generic (PLEG): container finished" podID="90d2c008-b1eb-4be6-b9aa-932ac14e287b" containerID="01cd1073b088755e137920627334a3246f625f8fd93a94c4ab4d26d863144d42" exitCode=0 Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.405619 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" event={"ID":"90d2c008-b1eb-4be6-b9aa-932ac14e287b","Type":"ContainerDied","Data":"01cd1073b088755e137920627334a3246f625f8fd93a94c4ab4d26d863144d42"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.407554 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtv6m" event={"ID":"567c32b2-5ca2-4ebf-93f2-76c03a5add49","Type":"ContainerStarted","Data":"0ee1085ac12971aaec815c4546ec9f44dc4c3d76c683ba6fb5073a35a0387092"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.407580 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtv6m" event={"ID":"567c32b2-5ca2-4ebf-93f2-76c03a5add49","Type":"ContainerStarted","Data":"5ba5493c676011ec953fa9a0ee649acc9c7bbf7b9423425d6cd0a0e8ff0ad60d"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.412054 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5hgv" event={"ID":"a333b1b6-9ef2-4332-9622-e9fcc9d41854","Type":"ContainerStarted","Data":"eac252735903913f8661462e07b24ff978ccdcc2edc9ab499bb5168fbaab94b7"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.412079 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5hgv" 
event={"ID":"a333b1b6-9ef2-4332-9622-e9fcc9d41854","Type":"ContainerStarted","Data":"e1c35647606272b607d17c35a2f8175481e0c64dcdb3341f6dd5339334895fd2"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.413878 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s8vb" event={"ID":"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5","Type":"ContainerStarted","Data":"ebe0f88ef9bb3be4324e7432244a37717949e30188dbac95fcf83eebfc1a1c66"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.415253 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ljqjq" event={"ID":"cfb75e0b-4e7b-484f-832b-5ed69650f1f1","Type":"ContainerStarted","Data":"4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.416029 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerStarted","Data":"37fbd0932e634d778b202d7543ffc23cb36b827899ffe2760a118b98c551d0a7"} Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.455355 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:34:34 crc kubenswrapper[4815]: W1207 19:34:34.456222 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a840d50_ae2e_4131_b162_f129b669461a.slice/crio-5a8b785667a9078c79b24d69aedced11609d0ea8269bdc9993c08573e8a5edf3 WatchSource:0}: Error finding container 5a8b785667a9078c79b24d69aedced11609d0ea8269bdc9993c08573e8a5edf3: Status 404 returned error can't find the container with id 5a8b785667a9078c79b24d69aedced11609d0ea8269bdc9993c08573e8a5edf3 Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.459832 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qtv6m" podStartSLOduration=3.459810512 
podStartE2EDuration="3.459810512s" podCreationTimestamp="2025-12-07 19:34:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:34:34.453112792 +0000 UTC m=+1179.032102837" watchObservedRunningTime="2025-12-07 19:34:34.459810512 +0000 UTC m=+1179.038800557" Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.477110 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-p5hgv" podStartSLOduration=2.477090139 podStartE2EDuration="2.477090139s" podCreationTimestamp="2025-12-07 19:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:34:34.474418734 +0000 UTC m=+1179.053408779" watchObservedRunningTime="2025-12-07 19:34:34.477090139 +0000 UTC m=+1179.056080184" Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.911821 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:34 crc kubenswrapper[4815]: I1207 19:34:34.950962 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.023004 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config\") pod \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.023052 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv725\" (UniqueName: \"kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725\") pod \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.023114 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb\") pod \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.023143 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb\") pod \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.023221 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc\") pod \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\" (UID: \"90d2c008-b1eb-4be6-b9aa-932ac14e287b\") " Dec 07 19:34:35 crc 
kubenswrapper[4815]: I1207 19:34:35.054135 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90d2c008-b1eb-4be6-b9aa-932ac14e287b" (UID: "90d2c008-b1eb-4be6-b9aa-932ac14e287b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.054161 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725" (OuterVolumeSpecName: "kube-api-access-jv725") pod "90d2c008-b1eb-4be6-b9aa-932ac14e287b" (UID: "90d2c008-b1eb-4be6-b9aa-932ac14e287b"). InnerVolumeSpecName "kube-api-access-jv725". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.069531 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90d2c008-b1eb-4be6-b9aa-932ac14e287b" (UID: "90d2c008-b1eb-4be6-b9aa-932ac14e287b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.090442 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config" (OuterVolumeSpecName: "config") pod "90d2c008-b1eb-4be6-b9aa-932ac14e287b" (UID: "90d2c008-b1eb-4be6-b9aa-932ac14e287b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.126245 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.126276 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv725\" (UniqueName: \"kubernetes.io/projected/90d2c008-b1eb-4be6-b9aa-932ac14e287b-kube-api-access-jv725\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.126286 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.126297 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.132954 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90d2c008-b1eb-4be6-b9aa-932ac14e287b" (UID: "90d2c008-b1eb-4be6-b9aa-932ac14e287b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.229666 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90d2c008-b1eb-4be6-b9aa-932ac14e287b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.451515 4815 generic.go:334] "Generic (PLEG): container finished" podID="2a840d50-ae2e-4131-b162-f129b669461a" containerID="91f4f698a97ffc4874f4ea704899058e682f7dd1c9919e2d9d822e853f3d30a0" exitCode=0 Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.451903 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" event={"ID":"2a840d50-ae2e-4131-b162-f129b669461a","Type":"ContainerDied","Data":"91f4f698a97ffc4874f4ea704899058e682f7dd1c9919e2d9d822e853f3d30a0"} Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.451953 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" event={"ID":"2a840d50-ae2e-4131-b162-f129b669461a","Type":"ContainerStarted","Data":"5a8b785667a9078c79b24d69aedced11609d0ea8269bdc9993c08573e8a5edf3"} Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.494430 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" event={"ID":"90d2c008-b1eb-4be6-b9aa-932ac14e287b","Type":"ContainerDied","Data":"b74b9cf197af134eedc463e696aa7de5de24deaa5f9699dc37a8cb9edd690495"} Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.494475 4815 scope.go:117] "RemoveContainer" containerID="01cd1073b088755e137920627334a3246f625f8fd93a94c4ab4d26d863144d42" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.494573 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66fbd85b65-jz6kd" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.520690 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s7wqs" event={"ID":"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4","Type":"ContainerStarted","Data":"74bd87f41d19359e2f6de109e98f309e50a3be966bdc0417071bfb0fb9d3c5c8"} Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.523991 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfdqh" event={"ID":"c31f2b72-9e36-4a29-90e2-a3599b27f94b","Type":"ContainerStarted","Data":"a58ffe3789bb11332244a2798cf538f509cacc1062d0e66cd1643a88cd756aec"} Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.542109 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s7wqs" podStartSLOduration=2.633386387 podStartE2EDuration="35.54209016s" podCreationTimestamp="2025-12-07 19:34:00 +0000 UTC" firstStartedPulling="2025-12-07 19:34:01.135180138 +0000 UTC m=+1145.714170183" lastFinishedPulling="2025-12-07 19:34:34.043883911 +0000 UTC m=+1178.622873956" observedRunningTime="2025-12-07 19:34:35.539139757 +0000 UTC m=+1180.118129792" watchObservedRunningTime="2025-12-07 19:34:35.54209016 +0000 UTC m=+1180.121080205" Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.590809 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.607468 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66fbd85b65-jz6kd"] Dec 07 19:34:35 crc kubenswrapper[4815]: I1207 19:34:35.789860 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d2c008-b1eb-4be6-b9aa-932ac14e287b" path="/var/lib/kubelet/pods/90d2c008-b1eb-4be6-b9aa-932ac14e287b/volumes" Dec 07 19:34:36 crc kubenswrapper[4815]: I1207 19:34:36.536353 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" event={"ID":"2a840d50-ae2e-4131-b162-f129b669461a","Type":"ContainerStarted","Data":"47ef3d714c33868380940886b84dc9e96e24ff044e22ef62e630ed0bd9f78e52"} Dec 07 19:34:36 crc kubenswrapper[4815]: I1207 19:34:36.538048 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:36 crc kubenswrapper[4815]: I1207 19:34:36.560663 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" podStartSLOduration=4.560645811 podStartE2EDuration="4.560645811s" podCreationTimestamp="2025-12-07 19:34:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:34:36.556208895 +0000 UTC m=+1181.135198940" watchObservedRunningTime="2025-12-07 19:34:36.560645811 +0000 UTC m=+1181.139635856" Dec 07 19:34:40 crc kubenswrapper[4815]: I1207 19:34:40.574898 4815 generic.go:334] "Generic (PLEG): container finished" podID="567c32b2-5ca2-4ebf-93f2-76c03a5add49" containerID="0ee1085ac12971aaec815c4546ec9f44dc4c3d76c683ba6fb5073a35a0387092" exitCode=0 Dec 07 19:34:40 crc kubenswrapper[4815]: I1207 19:34:40.574993 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtv6m" event={"ID":"567c32b2-5ca2-4ebf-93f2-76c03a5add49","Type":"ContainerDied","Data":"0ee1085ac12971aaec815c4546ec9f44dc4c3d76c683ba6fb5073a35a0387092"} Dec 07 19:34:43 crc kubenswrapper[4815]: I1207 19:34:43.219252 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:34:43 crc kubenswrapper[4815]: I1207 19:34:43.279645 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:34:43 crc kubenswrapper[4815]: I1207 19:34:43.280010 4815 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-8554648995-mhmdw" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" containerID="cri-o://37094d35dbf9ecac4d2980e440888c791ee77a7066969174f63a4bb00a828281" gracePeriod=10 Dec 07 19:34:43 crc kubenswrapper[4815]: I1207 19:34:43.601131 4815 generic.go:334] "Generic (PLEG): container finished" podID="32f95e51-757c-4167-ad8d-32f472266fe5" containerID="37094d35dbf9ecac4d2980e440888c791ee77a7066969174f63a4bb00a828281" exitCode=0 Dec 07 19:34:43 crc kubenswrapper[4815]: I1207 19:34:43.601500 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhmdw" event={"ID":"32f95e51-757c-4167-ad8d-32f472266fe5","Type":"ContainerDied","Data":"37094d35dbf9ecac4d2980e440888c791ee77a7066969174f63a4bb00a828281"} Dec 07 19:34:50 crc kubenswrapper[4815]: I1207 19:34:50.677118 4815 generic.go:334] "Generic (PLEG): container finished" podID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" containerID="74bd87f41d19359e2f6de109e98f309e50a3be966bdc0417071bfb0fb9d3c5c8" exitCode=0 Dec 07 19:34:50 crc kubenswrapper[4815]: I1207 19:34:50.677645 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s7wqs" event={"ID":"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4","Type":"ContainerDied","Data":"74bd87f41d19359e2f6de109e98f309e50a3be966bdc0417071bfb0fb9d3c5c8"} Dec 07 19:34:53 crc kubenswrapper[4815]: I1207 19:34:53.128425 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-mhmdw" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.359509 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.360187 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.360266 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.361175 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.361259 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e" gracePeriod=600 Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.580065 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739018 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739145 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739226 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739278 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgcsv\" (UniqueName: \"kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739368 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.739388 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle\") pod \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\" (UID: \"567c32b2-5ca2-4ebf-93f2-76c03a5add49\") " Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.745551 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.745578 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts" (OuterVolumeSpecName: "scripts") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.747472 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv" (OuterVolumeSpecName: "kube-api-access-rgcsv") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "kube-api-access-rgcsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.753584 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.781130 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.790492 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data" (OuterVolumeSpecName: "config-data") pod "567c32b2-5ca2-4ebf-93f2-76c03a5add49" (UID: "567c32b2-5ca2-4ebf-93f2-76c03a5add49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840731 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840768 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgcsv\" (UniqueName: \"kubernetes.io/projected/567c32b2-5ca2-4ebf-93f2-76c03a5add49-kube-api-access-rgcsv\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840779 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840788 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 
19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840796 4815 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:56 crc kubenswrapper[4815]: I1207 19:34:56.840805 4815 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/567c32b2-5ca2-4ebf-93f2-76c03a5add49-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.061364 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.061515 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ppxvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-5s8vb_openstack(a9829ff6-a37c-403a-9b72-8a2c1e3df5d5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.062871 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-5s8vb" 
podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.087825 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.095523 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246321 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data\") pod \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246460 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc\") pod \"32f95e51-757c-4167-ad8d-32f472266fe5\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246511 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb\") pod \"32f95e51-757c-4167-ad8d-32f472266fe5\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246564 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config\") pod \"32f95e51-757c-4167-ad8d-32f472266fe5\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246612 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle\") pod \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246638 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data\") pod \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.246743 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfq5r\" (UniqueName: \"kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r\") pod \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\" (UID: \"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.247645 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb\") pod \"32f95e51-757c-4167-ad8d-32f472266fe5\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.247676 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9nkb\" (UniqueName: \"kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb\") pod \"32f95e51-757c-4167-ad8d-32f472266fe5\" (UID: \"32f95e51-757c-4167-ad8d-32f472266fe5\") " Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.250288 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r" (OuterVolumeSpecName: "kube-api-access-pfq5r") pod "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" (UID: "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4"). 
InnerVolumeSpecName "kube-api-access-pfq5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.256669 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" (UID: "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.265682 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e" exitCode=0 Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.265754 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e"} Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.265792 4815 scope.go:117] "RemoveContainer" containerID="79cc7c5fc46172fc78f3ba5349136e2b03ed048984c9a5cc4bf7488a43b013ac" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.266071 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb" (OuterVolumeSpecName: "kube-api-access-m9nkb") pod "32f95e51-757c-4167-ad8d-32f472266fe5" (UID: "32f95e51-757c-4167-ad8d-32f472266fe5"). InnerVolumeSpecName "kube-api-access-m9nkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.269730 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s7wqs" event={"ID":"9e07853f-f24e-4bf6-8af1-15e4e9cccbc4","Type":"ContainerDied","Data":"0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87"} Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.269754 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0708348e73ca8b638ead18552c8d669357b12970682a4beaad8e3da5e1e21d87" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.269853 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s7wqs" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.273043 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" (UID: "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.273530 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtv6m" event={"ID":"567c32b2-5ca2-4ebf-93f2-76c03a5add49","Type":"ContainerDied","Data":"5ba5493c676011ec953fa9a0ee649acc9c7bbf7b9423425d6cd0a0e8ff0ad60d"} Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.273554 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba5493c676011ec953fa9a0ee649acc9c7bbf7b9423425d6cd0a0e8ff0ad60d" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.273595 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qtv6m" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.284112 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-mhmdw" event={"ID":"32f95e51-757c-4167-ad8d-32f472266fe5","Type":"ContainerDied","Data":"fa7a4880bd65fed8bace853161447f89603fc2e95d799864354b90ee528c5d4c"} Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.284140 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-mhmdw" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.288123 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-5s8vb" podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.313765 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32f95e51-757c-4167-ad8d-32f472266fe5" (UID: "32f95e51-757c-4167-ad8d-32f472266fe5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.313785 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config" (OuterVolumeSpecName: "config") pod "32f95e51-757c-4167-ad8d-32f472266fe5" (UID: "32f95e51-757c-4167-ad8d-32f472266fe5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.314589 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32f95e51-757c-4167-ad8d-32f472266fe5" (UID: "32f95e51-757c-4167-ad8d-32f472266fe5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.324719 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32f95e51-757c-4167-ad8d-32f472266fe5" (UID: "32f95e51-757c-4167-ad8d-32f472266fe5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.328090 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data" (OuterVolumeSpecName: "config-data") pod "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" (UID: "9e07853f-f24e-4bf6-8af1-15e4e9cccbc4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.349940 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.349967 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.349977 4815 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.349987 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfq5r\" (UniqueName: \"kubernetes.io/projected/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-kube-api-access-pfq5r\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.350015 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.350025 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9nkb\" (UniqueName: \"kubernetes.io/projected/32f95e51-757c-4167-ad8d-32f472266fe5-kube-api-access-m9nkb\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.350032 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 
19:34:57.350043 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.350052 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32f95e51-757c-4167-ad8d-32f472266fe5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.659985 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.677998 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-mhmdw"] Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.692155 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qtv6m"] Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.699452 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qtv6m"] Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.794178 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" path="/var/lib/kubelet/pods/32f95e51-757c-4167-ad8d-32f472266fe5/volumes" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.795326 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567c32b2-5ca2-4ebf-93f2-76c03a5add49" path="/var/lib/kubelet/pods/567c32b2-5ca2-4ebf-93f2-76c03a5add49/volumes" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796044 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xxh6g"] Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.796336 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" containerName="glance-db-sync" Dec 07 19:34:57 
crc kubenswrapper[4815]: I1207 19:34:57.796354 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" containerName="glance-db-sync" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.796373 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567c32b2-5ca2-4ebf-93f2-76c03a5add49" containerName="keystone-bootstrap" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796381 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="567c32b2-5ca2-4ebf-93f2-76c03a5add49" containerName="keystone-bootstrap" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.796401 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796409 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.796417 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d2c008-b1eb-4be6-b9aa-932ac14e287b" containerName="init" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796424 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d2c008-b1eb-4be6-b9aa-932ac14e287b" containerName="init" Dec 07 19:34:57 crc kubenswrapper[4815]: E1207 19:34:57.796440 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="init" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796448 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="init" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796764 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="567c32b2-5ca2-4ebf-93f2-76c03a5add49" containerName="keystone-bootstrap" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796796 4815 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" containerName="glance-db-sync" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796810 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d2c008-b1eb-4be6-b9aa-932ac14e287b" containerName="init" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.796827 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.797771 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xxh6g"] Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.797882 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.800255 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.800764 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.801006 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mdlc7" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.801083 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.801147 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967248 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njtq\" (UniqueName: \"kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq\") pod \"keystone-bootstrap-xxh6g\" 
(UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967315 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967396 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967435 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967455 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:57 crc kubenswrapper[4815]: I1207 19:34:57.967470 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: 
\"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.069182 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.069226 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.069251 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.069314 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njtq\" (UniqueName: \"kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.069336 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 
crc kubenswrapper[4815]: I1207 19:34:58.069381 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.075417 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.075603 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.079477 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.084696 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.088323 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.088731 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njtq\" (UniqueName: \"kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq\") pod \"keystone-bootstrap-xxh6g\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") " pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.122232 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxh6g" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.129250 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8554648995-mhmdw" podUID="32f95e51-757c-4167-ad8d-32f472266fe5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.295950 4815 generic.go:334] "Generic (PLEG): container finished" podID="a333b1b6-9ef2-4332-9622-e9fcc9d41854" containerID="eac252735903913f8661462e07b24ff978ccdcc2edc9ab499bb5168fbaab94b7" exitCode=0 Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.295995 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5hgv" event={"ID":"a333b1b6-9ef2-4332-9622-e9fcc9d41854","Type":"ContainerDied","Data":"eac252735903913f8661462e07b24ff978ccdcc2edc9ab499bb5168fbaab94b7"} Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.602503 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"] Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.604182 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.611744 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"] Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.780741 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.780794 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.780945 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrlt5\" (UniqueName: \"kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.781017 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.781059 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.883171 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.883235 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.883338 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrlt5\" (UniqueName: \"kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.883386 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.883431 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.884714 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.884763 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.885184 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.885435 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb\") pod \"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.900268 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrlt5\" (UniqueName: \"kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5\") pod 
\"dnsmasq-dns-5b6dbdb6f5-qffnw\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:34:58 crc kubenswrapper[4815]: I1207 19:34:58.925154 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:35:00 crc kubenswrapper[4815]: E1207 19:35:00.126894 4815 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 07 19:35:00 crc kubenswrapper[4815]: E1207 19:35:00.127279 4815 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:
/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdzfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ljqjq_openstack(cfb75e0b-4e7b-484f-832b-5ed69650f1f1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 07 19:35:00 crc kubenswrapper[4815]: E1207 19:35:00.128401 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ljqjq" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.196595 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.208586 4815 scope.go:117] "RemoveContainer" containerID="37094d35dbf9ecac4d2980e440888c791ee77a7066969174f63a4bb00a828281" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.315881 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-p5hgv" event={"ID":"a333b1b6-9ef2-4332-9622-e9fcc9d41854","Type":"ContainerDied","Data":"e1c35647606272b607d17c35a2f8175481e0c64dcdb3341f6dd5339334895fd2"} Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.316272 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c35647606272b607d17c35a2f8175481e0c64dcdb3341f6dd5339334895fd2" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.315886 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-p5hgv" Dec 07 19:35:00 crc kubenswrapper[4815]: E1207 19:35:00.335608 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ljqjq" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.336434 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crgd5\" (UniqueName: \"kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5\") pod \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.336637 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle\") pod 
\"a333b1b6-9ef2-4332-9622-e9fcc9d41854\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.336738 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config\") pod \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\" (UID: \"a333b1b6-9ef2-4332-9622-e9fcc9d41854\") " Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.347695 4815 scope.go:117] "RemoveContainer" containerID="d28702612d320e416cdc28f964877956ebcc35a1d48210bdfc819fcff8b5e4f7" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.371740 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5" (OuterVolumeSpecName: "kube-api-access-crgd5") pod "a333b1b6-9ef2-4332-9622-e9fcc9d41854" (UID: "a333b1b6-9ef2-4332-9622-e9fcc9d41854"). InnerVolumeSpecName "kube-api-access-crgd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.421697 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a333b1b6-9ef2-4332-9622-e9fcc9d41854" (UID: "a333b1b6-9ef2-4332-9622-e9fcc9d41854"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.439662 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crgd5\" (UniqueName: \"kubernetes.io/projected/a333b1b6-9ef2-4332-9622-e9fcc9d41854-kube-api-access-crgd5\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.439703 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.570036 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config" (OuterVolumeSpecName: "config") pod "a333b1b6-9ef2-4332-9622-e9fcc9d41854" (UID: "a333b1b6-9ef2-4332-9622-e9fcc9d41854"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.571748 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"] Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.609621 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:00 crc kubenswrapper[4815]: E1207 19:35:00.610232 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a333b1b6-9ef2-4332-9622-e9fcc9d41854" containerName="neutron-db-sync" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.610250 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a333b1b6-9ef2-4332-9622-e9fcc9d41854" containerName="neutron-db-sync" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.610503 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a333b1b6-9ef2-4332-9622-e9fcc9d41854" containerName="neutron-db-sync" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 
19:35:00.611909 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.617888 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.671825 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a333b1b6-9ef2-4332-9622-e9fcc9d41854-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.723111 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.724596 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.740945 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.749520 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zlbgr" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.750081 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.750230 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.764804 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.778292 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config\") pod 
\"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.778348 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.778388 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrh7\" (UniqueName: \"kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.778448 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.778498 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.869757 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xxh6g"] Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.881865 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.881965 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882002 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882056 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhc6v\" (UniqueName: \"kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882087 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882119 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bjrh7\" (UniqueName: \"kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882163 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882187 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882215 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.882284 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.904596 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.938965 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrh7\" (UniqueName: \"kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.950843 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.955038 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.956494 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc\") pod \"dnsmasq-dns-5f66db59b9-qc6xg\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.983435 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.983477 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.983511 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhc6v\" (UniqueName: \"kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.983554 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.983571 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.989862 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle\") pod 
\"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.990747 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.991439 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:00 crc kubenswrapper[4815]: I1207 19:35:00.994066 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.001730 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.010861 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhc6v\" (UniqueName: \"kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v\") pod \"neutron-66fb95d9b4-wdncv\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.067852 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.335712 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"] Dec 07 19:35:01 crc kubenswrapper[4815]: W1207 19:35:01.362776 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87674f69_b732_40d9_98aa_fa6827cc83f9.slice/crio-960b17a63076a69e2820e65d04e274d07abc8e302870fdfbe3beaf755900a182 WatchSource:0}: Error finding container 960b17a63076a69e2820e65d04e274d07abc8e302870fdfbe3beaf755900a182: Status 404 returned error can't find the container with id 960b17a63076a69e2820e65d04e274d07abc8e302870fdfbe3beaf755900a182 Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.451673 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerStarted","Data":"6cfc9f445c2c6b5c1aaf487517b32ed8cdf2aa13f5153d0592c6d4402a336c89"} Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.462939 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxh6g" event={"ID":"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe","Type":"ContainerStarted","Data":"db2dba635269f6582f669f45136de443af753ca439629cb6f37da9e9526aacf2"} Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.469159 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfdqh" event={"ID":"c31f2b72-9e36-4a29-90e2-a3599b27f94b","Type":"ContainerStarted","Data":"6eb415b1b752efa9ed7b88727e7e457320edcf19312d3224929a6a5dbd27ff24"} Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.498146 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" 
event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17"} Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.513621 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jfdqh" podStartSLOduration=3.878207238 podStartE2EDuration="29.513601167s" podCreationTimestamp="2025-12-07 19:34:32 +0000 UTC" firstStartedPulling="2025-12-07 19:34:34.398768969 +0000 UTC m=+1178.977759004" lastFinishedPulling="2025-12-07 19:35:00.034162898 +0000 UTC m=+1204.613152933" observedRunningTime="2025-12-07 19:35:01.511329433 +0000 UTC m=+1206.090319478" watchObservedRunningTime="2025-12-07 19:35:01.513601167 +0000 UTC m=+1206.092591212" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.516696 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xxh6g" podStartSLOduration=4.5166851040000005 podStartE2EDuration="4.516685104s" podCreationTimestamp="2025-12-07 19:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:01.488459018 +0000 UTC m=+1206.067449063" watchObservedRunningTime="2025-12-07 19:35:01.516685104 +0000 UTC m=+1206.095675149" Dec 07 19:35:01 crc kubenswrapper[4815]: I1207 19:35:01.950588 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.043620 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.541246 4815 generic.go:334] "Generic (PLEG): container finished" podID="87674f69-b732-40d9-98aa-fa6827cc83f9" containerID="91dee73f94c09932eac6be45c404a4dff4d4b5de11b273830c72247fbc808477" exitCode=0 Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.541789 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" event={"ID":"87674f69-b732-40d9-98aa-fa6827cc83f9","Type":"ContainerDied","Data":"91dee73f94c09932eac6be45c404a4dff4d4b5de11b273830c72247fbc808477"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.541815 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" event={"ID":"87674f69-b732-40d9-98aa-fa6827cc83f9","Type":"ContainerStarted","Data":"960b17a63076a69e2820e65d04e274d07abc8e302870fdfbe3beaf755900a182"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.554247 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxh6g" event={"ID":"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe","Type":"ContainerStarted","Data":"cd537bca59835e29c964d50ab4b23521eee29ad171fc43572eeb1460e1e2bec4"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.561880 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerStarted","Data":"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.561956 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerStarted","Data":"10a39f47c10b2d312cf53aaec25cd38e4d05b84434262001d2240700c0c27579"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.566561 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerStarted","Data":"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9"} Dec 07 19:35:02 crc kubenswrapper[4815]: I1207 19:35:02.566621 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" 
event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerStarted","Data":"fd793563c416952c0bde61671b755698f845abca2a3d0032a3e8659a591a6643"} Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.003834 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.065584 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc\") pod \"87674f69-b732-40d9-98aa-fa6827cc83f9\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.066020 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb\") pod \"87674f69-b732-40d9-98aa-fa6827cc83f9\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.066185 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config\") pod \"87674f69-b732-40d9-98aa-fa6827cc83f9\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.066310 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb\") pod \"87674f69-b732-40d9-98aa-fa6827cc83f9\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.066462 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrlt5\" (UniqueName: 
\"kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5\") pod \"87674f69-b732-40d9-98aa-fa6827cc83f9\" (UID: \"87674f69-b732-40d9-98aa-fa6827cc83f9\") " Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.070331 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5" (OuterVolumeSpecName: "kube-api-access-vrlt5") pod "87674f69-b732-40d9-98aa-fa6827cc83f9" (UID: "87674f69-b732-40d9-98aa-fa6827cc83f9"). InnerVolumeSpecName "kube-api-access-vrlt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.110488 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config" (OuterVolumeSpecName: "config") pod "87674f69-b732-40d9-98aa-fa6827cc83f9" (UID: "87674f69-b732-40d9-98aa-fa6827cc83f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.124610 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87674f69-b732-40d9-98aa-fa6827cc83f9" (UID: "87674f69-b732-40d9-98aa-fa6827cc83f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.131306 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87674f69-b732-40d9-98aa-fa6827cc83f9" (UID: "87674f69-b732-40d9-98aa-fa6827cc83f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.138163 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87674f69-b732-40d9-98aa-fa6827cc83f9" (UID: "87674f69-b732-40d9-98aa-fa6827cc83f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.168639 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.168678 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.168691 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.168704 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrlt5\" (UniqueName: \"kubernetes.io/projected/87674f69-b732-40d9-98aa-fa6827cc83f9-kube-api-access-vrlt5\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.168718 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87674f69-b732-40d9-98aa-fa6827cc83f9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.576151 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" 
event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerStarted","Data":"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510"} Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.576985 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.579269 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" event={"ID":"87674f69-b732-40d9-98aa-fa6827cc83f9","Type":"ContainerDied","Data":"960b17a63076a69e2820e65d04e274d07abc8e302870fdfbe3beaf755900a182"} Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.579321 4815 scope.go:117] "RemoveContainer" containerID="91dee73f94c09932eac6be45c404a4dff4d4b5de11b273830c72247fbc808477" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.579429 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b6dbdb6f5-qffnw" Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.592012 4815 generic.go:334] "Generic (PLEG): container finished" podID="d3d79808-d535-4247-b6d3-6152049a185e" containerID="7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942" exitCode=0 Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.592126 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerDied","Data":"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942"} Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.592152 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerStarted","Data":"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444"} Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.592236 4815 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg"
Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.609187 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66fb95d9b4-wdncv" podStartSLOduration=3.609167067 podStartE2EDuration="3.609167067s" podCreationTimestamp="2025-12-07 19:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:03.603362263 +0000 UTC m=+1208.182352308" watchObservedRunningTime="2025-12-07 19:35:03.609167067 +0000 UTC m=+1208.188157112"
Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.665797 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"]
Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.675361 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b6dbdb6f5-qffnw"]
Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.682700 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" podStartSLOduration=3.6826791119999998 podStartE2EDuration="3.682679112s" podCreationTimestamp="2025-12-07 19:35:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:03.670250621 +0000 UTC m=+1208.249240666" watchObservedRunningTime="2025-12-07 19:35:03.682679112 +0000 UTC m=+1208.261669157"
Dec 07 19:35:03 crc kubenswrapper[4815]: I1207 19:35:03.783894 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87674f69-b732-40d9-98aa-fa6827cc83f9" path="/var/lib/kubelet/pods/87674f69-b732-40d9-98aa-fa6827cc83f9/volumes"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.598433 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-869d694d95-htxnx"]
Dec 07 19:35:04 crc kubenswrapper[4815]: E1207 19:35:04.599216 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87674f69-b732-40d9-98aa-fa6827cc83f9" containerName="init"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.599229 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="87674f69-b732-40d9-98aa-fa6827cc83f9" containerName="init"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.599388 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="87674f69-b732-40d9-98aa-fa6827cc83f9" containerName="init"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.600180 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.610031 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.615221 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.635350 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869d694d95-htxnx"]
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.665589 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerStarted","Data":"d8db10fcc665455ea782f9324b354461ff056f610d893659790bb546c69044ea"}
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.677223 4815 generic.go:334] "Generic (PLEG): container finished" podID="c31f2b72-9e36-4a29-90e2-a3599b27f94b" containerID="6eb415b1b752efa9ed7b88727e7e457320edcf19312d3224929a6a5dbd27ff24" exitCode=0
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.678659 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfdqh" event={"ID":"c31f2b72-9e36-4a29-90e2-a3599b27f94b","Type":"ContainerDied","Data":"6eb415b1b752efa9ed7b88727e7e457320edcf19312d3224929a6a5dbd27ff24"}
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693107 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-public-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693181 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-internal-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693232 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwgg\" (UniqueName: \"kubernetes.io/projected/565a9de4-ca65-41ab-83e0-1e7091486701-kube-api-access-dvwgg\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693307 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693359 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-httpd-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693392 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-combined-ca-bundle\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.693413 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-ovndb-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.794645 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-combined-ca-bundle\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.794702 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-ovndb-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.794775 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-public-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.794817 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-internal-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.794893 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwgg\" (UniqueName: \"kubernetes.io/projected/565a9de4-ca65-41ab-83e0-1e7091486701-kube-api-access-dvwgg\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.795111 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.795167 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-httpd-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.803552 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-ovndb-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.804300 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-internal-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.815250 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.815489 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-httpd-config\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.817715 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-public-tls-certs\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.819033 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/565a9de4-ca65-41ab-83e0-1e7091486701-combined-ca-bundle\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.833612 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwgg\" (UniqueName: \"kubernetes.io/projected/565a9de4-ca65-41ab-83e0-1e7091486701-kube-api-access-dvwgg\") pod \"neutron-869d694d95-htxnx\" (UID: \"565a9de4-ca65-41ab-83e0-1e7091486701\") " pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:04 crc kubenswrapper[4815]: I1207 19:35:04.922441 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:05.470785 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-869d694d95-htxnx"]
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:05.693096 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869d694d95-htxnx" event={"ID":"565a9de4-ca65-41ab-83e0-1e7091486701","Type":"ContainerStarted","Data":"309c8caa838647205a514591c0c41c77053a4fc462f45a63e50401da4c803710"}
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.410192 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jfdqh"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.441692 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t77cc\" (UniqueName: \"kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc\") pod \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") "
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.441757 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle\") pod \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") "
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.441840 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts\") pod \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") "
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.441892 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs\") pod \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") "
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.441957 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data\") pod \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\" (UID: \"c31f2b72-9e36-4a29-90e2-a3599b27f94b\") "
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.446282 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs" (OuterVolumeSpecName: "logs") pod "c31f2b72-9e36-4a29-90e2-a3599b27f94b" (UID: "c31f2b72-9e36-4a29-90e2-a3599b27f94b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.446682 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc" (OuterVolumeSpecName: "kube-api-access-t77cc") pod "c31f2b72-9e36-4a29-90e2-a3599b27f94b" (UID: "c31f2b72-9e36-4a29-90e2-a3599b27f94b"). InnerVolumeSpecName "kube-api-access-t77cc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.448321 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts" (OuterVolumeSpecName: "scripts") pod "c31f2b72-9e36-4a29-90e2-a3599b27f94b" (UID: "c31f2b72-9e36-4a29-90e2-a3599b27f94b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.469264 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data" (OuterVolumeSpecName: "config-data") pod "c31f2b72-9e36-4a29-90e2-a3599b27f94b" (UID: "c31f2b72-9e36-4a29-90e2-a3599b27f94b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.502029 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c31f2b72-9e36-4a29-90e2-a3599b27f94b" (UID: "c31f2b72-9e36-4a29-90e2-a3599b27f94b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.543758 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.543795 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-scripts\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.543805 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31f2b72-9e36-4a29-90e2-a3599b27f94b-logs\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.543813 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31f2b72-9e36-4a29-90e2-a3599b27f94b-config-data\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.543821 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t77cc\" (UniqueName: \"kubernetes.io/projected/c31f2b72-9e36-4a29-90e2-a3599b27f94b-kube-api-access-t77cc\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.705397 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfdqh" event={"ID":"c31f2b72-9e36-4a29-90e2-a3599b27f94b","Type":"ContainerDied","Data":"a58ffe3789bb11332244a2798cf538f509cacc1062d0e66cd1643a88cd756aec"}
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.705438 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58ffe3789bb11332244a2798cf538f509cacc1062d0e66cd1643a88cd756aec"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.705516 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jfdqh"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.717474 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869d694d95-htxnx" event={"ID":"565a9de4-ca65-41ab-83e0-1e7091486701","Type":"ContainerStarted","Data":"ea49f679ce098143df5432062a95a509df69545f8341aa7626d3dffa69c7bace"}
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.717521 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-869d694d95-htxnx" event={"ID":"565a9de4-ca65-41ab-83e0-1e7091486701","Type":"ContainerStarted","Data":"0f0e68ef852b697fc3c532c98a87570b839e1f64a2312e848a015eddc81693e1"}
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.717767 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-869d694d95-htxnx"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.720580 4815 generic.go:334] "Generic (PLEG): container finished" podID="2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" containerID="cd537bca59835e29c964d50ab4b23521eee29ad171fc43572eeb1460e1e2bec4" exitCode=0
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.720615 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxh6g" event={"ID":"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe","Type":"ContainerDied","Data":"cd537bca59835e29c964d50ab4b23521eee29ad171fc43572eeb1460e1e2bec4"}
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.773408 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-869d694d95-htxnx" podStartSLOduration=2.773385 podStartE2EDuration="2.773385s" podCreationTimestamp="2025-12-07 19:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:06.741852801 +0000 UTC m=+1211.320842846" watchObservedRunningTime="2025-12-07 19:35:06.773385 +0000 UTC m=+1211.352375045"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.814574 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8bbdc95bd-g568f"]
Dec 07 19:35:06 crc kubenswrapper[4815]: E1207 19:35:06.814938 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31f2b72-9e36-4a29-90e2-a3599b27f94b" containerName="placement-db-sync"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.814956 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31f2b72-9e36-4a29-90e2-a3599b27f94b" containerName="placement-db-sync"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.815481 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31f2b72-9e36-4a29-90e2-a3599b27f94b" containerName="placement-db-sync"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.816346 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.821370 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.821595 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.821764 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.821902 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.822157 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lv59v"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.832806 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8bbdc95bd-g568f"]
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.862671 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-combined-ca-bundle\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.862727 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-scripts\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.862793 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-config-data\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.862812 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1e0e26-2abf-4719-8973-301f1d821a4e-logs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.863141 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-public-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.863207 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-internal-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.863297 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k729m\" (UniqueName: \"kubernetes.io/projected/7f1e0e26-2abf-4719-8973-301f1d821a4e-kube-api-access-k729m\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964164 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-scripts\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964231 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-config-data\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964253 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1e0e26-2abf-4719-8973-301f1d821a4e-logs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964298 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-public-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-internal-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964354 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k729m\" (UniqueName: \"kubernetes.io/projected/7f1e0e26-2abf-4719-8973-301f1d821a4e-kube-api-access-k729m\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964387 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-combined-ca-bundle\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.964959 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f1e0e26-2abf-4719-8973-301f1d821a4e-logs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.969655 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-scripts\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.970983 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-combined-ca-bundle\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.971661 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-config-data\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.976065 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-public-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.976243 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f1e0e26-2abf-4719-8973-301f1d821a4e-internal-tls-certs\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:06 crc kubenswrapper[4815]: I1207 19:35:06.983673 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k729m\" (UniqueName: \"kubernetes.io/projected/7f1e0e26-2abf-4719-8973-301f1d821a4e-kube-api-access-k729m\") pod \"placement-8bbdc95bd-g568f\" (UID: \"7f1e0e26-2abf-4719-8973-301f1d821a4e\") " pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:07 crc kubenswrapper[4815]: I1207 19:35:07.137891 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8bbdc95bd-g568f"
Dec 07 19:35:08 crc kubenswrapper[4815]: I1207 19:35:08.831170 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxh6g"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.001620 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.001764 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.001899 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6njtq\" (UniqueName: \"kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.001986 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.002019 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.002066 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle\") pod \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\" (UID: \"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe\") "
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.009272 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq" (OuterVolumeSpecName: "kube-api-access-6njtq") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "kube-api-access-6njtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.009605 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts" (OuterVolumeSpecName: "scripts") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.012772 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.022655 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.042500 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data" (OuterVolumeSpecName: "config-data") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.050262 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" (UID: "2f0f688e-04dc-46a6-a4eb-1c8d5e635abe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103096 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6njtq\" (UniqueName: \"kubernetes.io/projected/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-kube-api-access-6njtq\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103127 4815 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103137 4815 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103147 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103156 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-scripts\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.103165 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe-config-data\") on node \"crc\" DevicePath \"\""
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.745199 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xxh6g" event={"ID":"2f0f688e-04dc-46a6-a4eb-1c8d5e635abe","Type":"ContainerDied","Data":"db2dba635269f6582f669f45136de443af753ca439629cb6f37da9e9526aacf2"}
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.745247 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2dba635269f6582f669f45136de443af753ca439629cb6f37da9e9526aacf2"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.745288 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xxh6g"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.953944 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-687459c5bc-4dzjp"]
Dec 07 19:35:09 crc kubenswrapper[4815]: E1207 19:35:09.954319 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" containerName="keystone-bootstrap"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.954358 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" containerName="keystone-bootstrap"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.954569 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" containerName="keystone-bootstrap"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.955213 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-687459c5bc-4dzjp"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.962460 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mdlc7"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.962560 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.962717 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.962804 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.963070 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.966196 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-687459c5bc-4dzjp"]
Dec 07 19:35:09 crc kubenswrapper[4815]: I1207 19:35:09.967820 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018655 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-scripts\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp"
Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018745 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-credential-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") "
pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018771 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-internal-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018800 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-fernet-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018816 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-combined-ca-bundle\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018831 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-public-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018856 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2gvh\" (UniqueName: \"kubernetes.io/projected/010b84d9-e9f5-4902-994f-9cbf1ce02d26-kube-api-access-p2gvh\") pod \"keystone-687459c5bc-4dzjp\" (UID: 
\"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.018876 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-config-data\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.120827 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-credential-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.120869 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-internal-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.120902 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-fernet-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.120997 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-combined-ca-bundle\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " 
pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.121020 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-public-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.121052 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2gvh\" (UniqueName: \"kubernetes.io/projected/010b84d9-e9f5-4902-994f-9cbf1ce02d26-kube-api-access-p2gvh\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.121080 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-config-data\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.121129 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-scripts\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.130803 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-scripts\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.131633 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-combined-ca-bundle\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.131726 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-config-data\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.132275 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-fernet-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.132463 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-credential-keys\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.140643 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-internal-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.150137 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/010b84d9-e9f5-4902-994f-9cbf1ce02d26-public-tls-certs\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.156602 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2gvh\" (UniqueName: \"kubernetes.io/projected/010b84d9-e9f5-4902-994f-9cbf1ce02d26-kube-api-access-p2gvh\") pod \"keystone-687459c5bc-4dzjp\" (UID: \"010b84d9-e9f5-4902-994f-9cbf1ce02d26\") " pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:10 crc kubenswrapper[4815]: I1207 19:35:10.283957 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:11 crc kubenswrapper[4815]: I1207 19:35:11.005078 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:11 crc kubenswrapper[4815]: I1207 19:35:11.072775 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:35:11 crc kubenswrapper[4815]: I1207 19:35:11.073080 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="dnsmasq-dns" containerID="cri-o://47ef3d714c33868380940886b84dc9e96e24ff044e22ef62e630ed0bd9f78e52" gracePeriod=10 Dec 07 19:35:11 crc kubenswrapper[4815]: I1207 19:35:11.765937 4815 generic.go:334] "Generic (PLEG): container finished" podID="2a840d50-ae2e-4131-b162-f129b669461a" containerID="47ef3d714c33868380940886b84dc9e96e24ff044e22ef62e630ed0bd9f78e52" exitCode=0 Dec 07 19:35:11 crc kubenswrapper[4815]: I1207 19:35:11.766012 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" 
event={"ID":"2a840d50-ae2e-4131-b162-f129b669461a","Type":"ContainerDied","Data":"47ef3d714c33868380940886b84dc9e96e24ff044e22ef62e630ed0bd9f78e52"} Dec 07 19:35:12 crc kubenswrapper[4815]: I1207 19:35:12.918683 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.069670 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc\") pod \"2a840d50-ae2e-4131-b162-f129b669461a\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.069836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config\") pod \"2a840d50-ae2e-4131-b162-f129b669461a\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.070072 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wwwc\" (UniqueName: \"kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc\") pod \"2a840d50-ae2e-4131-b162-f129b669461a\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.071128 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb\") pod \"2a840d50-ae2e-4131-b162-f129b669461a\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.071222 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb\") 
pod \"2a840d50-ae2e-4131-b162-f129b669461a\" (UID: \"2a840d50-ae2e-4131-b162-f129b669461a\") " Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.077740 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc" (OuterVolumeSpecName: "kube-api-access-9wwwc") pod "2a840d50-ae2e-4131-b162-f129b669461a" (UID: "2a840d50-ae2e-4131-b162-f129b669461a"). InnerVolumeSpecName "kube-api-access-9wwwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.137885 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8bbdc95bd-g568f"] Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.149473 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a840d50-ae2e-4131-b162-f129b669461a" (UID: "2a840d50-ae2e-4131-b162-f129b669461a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.151098 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a840d50-ae2e-4131-b162-f129b669461a" (UID: "2a840d50-ae2e-4131-b162-f129b669461a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.153721 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a840d50-ae2e-4131-b162-f129b669461a" (UID: "2a840d50-ae2e-4131-b162-f129b669461a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.159868 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-687459c5bc-4dzjp"] Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.163465 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config" (OuterVolumeSpecName: "config") pod "2a840d50-ae2e-4131-b162-f129b669461a" (UID: "2a840d50-ae2e-4131-b162-f129b669461a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.173976 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wwwc\" (UniqueName: \"kubernetes.io/projected/2a840d50-ae2e-4131-b162-f129b669461a-kube-api-access-9wwwc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.174004 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.174015 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.174034 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.174046 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a840d50-ae2e-4131-b162-f129b669461a-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:13 crc kubenswrapper[4815]: 
I1207 19:35:13.796743 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" event={"ID":"2a840d50-ae2e-4131-b162-f129b669461a","Type":"ContainerDied","Data":"5a8b785667a9078c79b24d69aedced11609d0ea8269bdc9993c08573e8a5edf3"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.797000 4815 scope.go:117] "RemoveContainer" containerID="47ef3d714c33868380940886b84dc9e96e24ff044e22ef62e630ed0bd9f78e52" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.797120 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bf59f66bf-dftst" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.799477 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-687459c5bc-4dzjp" event={"ID":"010b84d9-e9f5-4902-994f-9cbf1ce02d26","Type":"ContainerStarted","Data":"f281e1137efedd01e9c3ce041eb67c6653c3f9edc0c1da01580ce9c7cadc8c36"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.799505 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-687459c5bc-4dzjp" event={"ID":"010b84d9-e9f5-4902-994f-9cbf1ce02d26","Type":"ContainerStarted","Data":"ef376fb936e767f097bf2f2f666c6b85b10e8eee70c956e152dc75ea247beef9"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.800161 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.810159 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8bbdc95bd-g568f" event={"ID":"7f1e0e26-2abf-4719-8973-301f1d821a4e","Type":"ContainerStarted","Data":"c550d6214c00327c9ce1e8ce7f7b3377b4f3fc5d69260ef7c1a6a90f8a6fd4be"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.810271 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8bbdc95bd-g568f" 
event={"ID":"7f1e0e26-2abf-4719-8973-301f1d821a4e","Type":"ContainerStarted","Data":"5d28622184cd4b4f2b001f95a85d0686690cca239be9b3aaa888f0bbed58a8e1"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.810347 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8bbdc95bd-g568f" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.810414 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8bbdc95bd-g568f" event={"ID":"7f1e0e26-2abf-4719-8973-301f1d821a4e","Type":"ContainerStarted","Data":"ad1d436fd932240e709e78cb1b475077ac2c9b164e9c937c581765ff7c46cc28"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.811814 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s8vb" event={"ID":"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5","Type":"ContainerStarted","Data":"e5171d2eca80ad011fa8ee06ba4baec734cf54b36eb0a0c632b1146f3e2e3fd1"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.813734 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerStarted","Data":"03f5e66f8f929a2eb3d5b4a807b2b2019938d3990b67f87bb25a0df5ca30144c"} Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.824030 4815 scope.go:117] "RemoveContainer" containerID="91f4f698a97ffc4874f4ea704899058e682f7dd1c9919e2d9d822e853f3d30a0" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.830343 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-687459c5bc-4dzjp" podStartSLOduration=4.830323559 podStartE2EDuration="4.830323559s" podCreationTimestamp="2025-12-07 19:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:13.827835929 +0000 UTC m=+1218.406825974" watchObservedRunningTime="2025-12-07 19:35:13.830323559 +0000 UTC m=+1218.409313604" Dec 07 
19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.912065 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.922514 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bf59f66bf-dftst"] Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.926078 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8bbdc95bd-g568f" podStartSLOduration=7.926057392 podStartE2EDuration="7.926057392s" podCreationTimestamp="2025-12-07 19:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:13.883983784 +0000 UTC m=+1218.462973829" watchObservedRunningTime="2025-12-07 19:35:13.926057392 +0000 UTC m=+1218.505047437" Dec 07 19:35:13 crc kubenswrapper[4815]: I1207 19:35:13.939966 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-5s8vb" podStartSLOduration=3.229543638 podStartE2EDuration="41.939942934s" podCreationTimestamp="2025-12-07 19:34:32 +0000 UTC" firstStartedPulling="2025-12-07 19:34:34.039504378 +0000 UTC m=+1178.618494423" lastFinishedPulling="2025-12-07 19:35:12.749903674 +0000 UTC m=+1217.328893719" observedRunningTime="2025-12-07 19:35:13.902523757 +0000 UTC m=+1218.481513802" watchObservedRunningTime="2025-12-07 19:35:13.939942934 +0000 UTC m=+1218.518932979" Dec 07 19:35:14 crc kubenswrapper[4815]: I1207 19:35:14.825512 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8bbdc95bd-g568f" Dec 07 19:35:15 crc kubenswrapper[4815]: I1207 19:35:15.780968 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a840d50-ae2e-4131-b162-f129b669461a" path="/var/lib/kubelet/pods/2a840d50-ae2e-4131-b162-f129b669461a/volumes" Dec 07 19:35:16 crc kubenswrapper[4815]: I1207 19:35:16.845623 
4815 generic.go:334] "Generic (PLEG): container finished" podID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" containerID="e5171d2eca80ad011fa8ee06ba4baec734cf54b36eb0a0c632b1146f3e2e3fd1" exitCode=0 Dec 07 19:35:16 crc kubenswrapper[4815]: I1207 19:35:16.846483 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s8vb" event={"ID":"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5","Type":"ContainerDied","Data":"e5171d2eca80ad011fa8ee06ba4baec734cf54b36eb0a0c632b1146f3e2e3fd1"} Dec 07 19:35:16 crc kubenswrapper[4815]: I1207 19:35:16.849627 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ljqjq" event={"ID":"cfb75e0b-4e7b-484f-832b-5ed69650f1f1","Type":"ContainerStarted","Data":"9807558ff6db72094e22cacfd87a8d67aa115435ff38a4b101e4eb6219c86929"} Dec 07 19:35:16 crc kubenswrapper[4815]: I1207 19:35:16.889048 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ljqjq" podStartSLOduration=3.24274204 podStartE2EDuration="44.889029175s" podCreationTimestamp="2025-12-07 19:34:32 +0000 UTC" firstStartedPulling="2025-12-07 19:34:33.857107129 +0000 UTC m=+1178.436097174" lastFinishedPulling="2025-12-07 19:35:15.503394264 +0000 UTC m=+1220.082384309" observedRunningTime="2025-12-07 19:35:16.888541121 +0000 UTC m=+1221.467531166" watchObservedRunningTime="2025-12-07 19:35:16.889029175 +0000 UTC m=+1221.468019220" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.585978 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.623010 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data\") pod \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.623079 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle\") pod \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.623243 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxvc\" (UniqueName: \"kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc\") pod \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\" (UID: \"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5\") " Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.628993 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" (UID: "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.631304 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc" (OuterVolumeSpecName: "kube-api-access-ppxvc") pod "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" (UID: "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5"). 
InnerVolumeSpecName "kube-api-access-ppxvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.679495 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" (UID: "a9829ff6-a37c-403a-9b72-8a2c1e3df5d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.724977 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxvc\" (UniqueName: \"kubernetes.io/projected/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-kube-api-access-ppxvc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.725004 4815 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.725013 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.888285 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5s8vb" event={"ID":"a9829ff6-a37c-403a-9b72-8a2c1e3df5d5","Type":"ContainerDied","Data":"ebe0f88ef9bb3be4324e7432244a37717949e30188dbac95fcf83eebfc1a1c66"} Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.888327 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebe0f88ef9bb3be4324e7432244a37717949e30188dbac95fcf83eebfc1a1c66" Dec 07 19:35:20 crc kubenswrapper[4815]: I1207 19:35:20.888382 4815 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5s8vb" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.983869 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-867657f79f-kqjvk"] Dec 07 19:35:21 crc kubenswrapper[4815]: E1207 19:35:21.984833 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="init" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.984846 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="init" Dec 07 19:35:21 crc kubenswrapper[4815]: E1207 19:35:21.984875 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" containerName="barbican-db-sync" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.984881 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" containerName="barbican-db-sync" Dec 07 19:35:21 crc kubenswrapper[4815]: E1207 19:35:21.984899 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="dnsmasq-dns" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.984904 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="dnsmasq-dns" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.985125 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" containerName="barbican-db-sync" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.985142 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a840d50-ae2e-4131-b162-f129b669461a" containerName="dnsmasq-dns" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.986023 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.991454 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8jlxs" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.991654 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 07 19:35:21 crc kubenswrapper[4815]: I1207 19:35:21.991773 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.005038 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerStarted","Data":"0706952bb5f3c765ed6f14f7964fd27ddc584a9a073d3c8f103b0ccf8f4b49b5"} Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.005218 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-central-agent" containerID="cri-o://6cfc9f445c2c6b5c1aaf487517b32ed8cdf2aa13f5153d0592c6d4402a336c89" gracePeriod=30 Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.005528 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.005568 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="proxy-httpd" containerID="cri-o://0706952bb5f3c765ed6f14f7964fd27ddc584a9a073d3c8f103b0ccf8f4b49b5" gracePeriod=30 Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.006081 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" 
containerName="ceilometer-notification-agent" containerID="cri-o://d8db10fcc665455ea782f9324b354461ff056f610d893659790bb546c69044ea" gracePeriod=30 Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.011195 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-867657f79f-kqjvk"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.013134 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="sg-core" containerID="cri-o://03f5e66f8f929a2eb3d5b4a807b2b2019938d3990b67f87bb25a0df5ca30144c" gracePeriod=30 Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.039615 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-bc648dd8-m5s6x"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.041037 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.047179 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.055022 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bc648dd8-m5s6x"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.127167 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.675100429 podStartE2EDuration="51.127147378s" podCreationTimestamp="2025-12-07 19:34:31 +0000 UTC" firstStartedPulling="2025-12-07 19:34:33.664967316 +0000 UTC m=+1178.243957361" lastFinishedPulling="2025-12-07 19:35:21.117014265 +0000 UTC m=+1225.696004310" observedRunningTime="2025-12-07 19:35:22.10276708 +0000 UTC m=+1226.681757125" watchObservedRunningTime="2025-12-07 19:35:22.127147378 +0000 UTC m=+1226.706137413" Dec 
07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.135495 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.136943 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.182762 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189775 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-combined-ca-bundle\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189819 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189841 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189893 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/22a26306-1a35-4196-8538-3361e51808fc-logs\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189954 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pf89\" (UniqueName: \"kubernetes.io/projected/22a26306-1a35-4196-8538-3361e51808fc-kube-api-access-9pf89\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189982 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data-custom\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.189996 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n6rw\" (UniqueName: \"kubernetes.io/projected/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-kube-api-access-5n6rw\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190014 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrnmh\" (UniqueName: \"kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 
19:35:22.190036 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data-custom\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190069 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190099 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190117 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190140 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-combined-ca-bundle\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " 
pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190162 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-logs\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.190180 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.240989 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.249790 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.252258 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.264818 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291628 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-combined-ca-bundle\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291692 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291792 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291824 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22a26306-1a35-4196-8538-3361e51808fc-logs\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc 
kubenswrapper[4815]: I1207 19:35:22.291885 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pf89\" (UniqueName: \"kubernetes.io/projected/22a26306-1a35-4196-8538-3361e51808fc-kube-api-access-9pf89\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291943 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data-custom\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291970 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n6rw\" (UniqueName: \"kubernetes.io/projected/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-kube-api-access-5n6rw\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.291994 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrnmh\" (UniqueName: \"kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292028 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data-custom\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " 
pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292082 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292117 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292145 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292178 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-combined-ca-bundle\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292208 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-logs\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " 
pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.292241 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.293585 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.296738 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.297451 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-logs\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.299559 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 
19:35:22.300413 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22a26306-1a35-4196-8538-3361e51808fc-logs\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.302768 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.309951 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-combined-ca-bundle\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.310627 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-combined-ca-bundle\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.323005 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pf89\" (UniqueName: \"kubernetes.io/projected/22a26306-1a35-4196-8538-3361e51808fc-kube-api-access-9pf89\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.323392 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.323688 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data-custom\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.324377 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-config-data\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.325085 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22a26306-1a35-4196-8538-3361e51808fc-config-data-custom\") pod \"barbican-worker-867657f79f-kqjvk\" (UID: \"22a26306-1a35-4196-8538-3361e51808fc\") " pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.327847 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrnmh\" (UniqueName: \"kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh\") pod \"dnsmasq-dns-869f779d85-crppt\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.332550 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n6rw\" (UniqueName: \"kubernetes.io/projected/7b554cad-0000-4ecb-97df-7f0fbdb8c7e8-kube-api-access-5n6rw\") pod \"barbican-keystone-listener-bc648dd8-m5s6x\" (UID: \"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8\") " pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.370936 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.386205 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.393309 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.393343 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.393366 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 
19:35:22.393426 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5m6q\" (UniqueName: \"kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.393514 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.495178 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5m6q\" (UniqueName: \"kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.495521 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.495562 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 
19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.495579 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.495607 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.497675 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.506313 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.511463 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.515593 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.519009 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5m6q\" (UniqueName: \"kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q\") pod \"barbican-api-6cb9b7fdf8-65cp6\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.616157 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-867657f79f-kqjvk" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.697906 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:22 crc kubenswrapper[4815]: I1207 19:35:22.987996 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.028369 4815 generic.go:334] "Generic (PLEG): container finished" podID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" containerID="9807558ff6db72094e22cacfd87a8d67aa115435ff38a4b101e4eb6219c86929" exitCode=0 Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.028435 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ljqjq" event={"ID":"cfb75e0b-4e7b-484f-832b-5ed69650f1f1","Type":"ContainerDied","Data":"9807558ff6db72094e22cacfd87a8d67aa115435ff38a4b101e4eb6219c86929"} Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.029831 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-crppt" 
event={"ID":"890f70c1-67cc-4e1f-883f-3468d96efc77","Type":"ContainerStarted","Data":"67f5ea7334f56eceae22c95a0065bb2fe809856bfa80d4d2d01d05b13a6096f0"} Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051344 4815 generic.go:334] "Generic (PLEG): container finished" podID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerID="0706952bb5f3c765ed6f14f7964fd27ddc584a9a073d3c8f103b0ccf8f4b49b5" exitCode=0 Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051379 4815 generic.go:334] "Generic (PLEG): container finished" podID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerID="03f5e66f8f929a2eb3d5b4a807b2b2019938d3990b67f87bb25a0df5ca30144c" exitCode=2 Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051389 4815 generic.go:334] "Generic (PLEG): container finished" podID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerID="6cfc9f445c2c6b5c1aaf487517b32ed8cdf2aa13f5153d0592c6d4402a336c89" exitCode=0 Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051413 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerDied","Data":"0706952bb5f3c765ed6f14f7964fd27ddc584a9a073d3c8f103b0ccf8f4b49b5"} Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051443 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerDied","Data":"03f5e66f8f929a2eb3d5b4a807b2b2019938d3990b67f87bb25a0df5ca30144c"} Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.051454 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerDied","Data":"6cfc9f445c2c6b5c1aaf487517b32ed8cdf2aa13f5153d0592c6d4402a336c89"} Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.087239 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bc648dd8-m5s6x"] Dec 07 
19:35:23 crc kubenswrapper[4815]: W1207 19:35:23.093188 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b554cad_0000_4ecb_97df_7f0fbdb8c7e8.slice/crio-6b8e981c6a3bd4ff12cbe774c0b5aa72ba8e04a5e6b7e7a1d4d5bb36b9f7b1af WatchSource:0}: Error finding container 6b8e981c6a3bd4ff12cbe774c0b5aa72ba8e04a5e6b7e7a1d4d5bb36b9f7b1af: Status 404 returned error can't find the container with id 6b8e981c6a3bd4ff12cbe774c0b5aa72ba8e04a5e6b7e7a1d4d5bb36b9f7b1af Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.192627 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:23 crc kubenswrapper[4815]: W1207 19:35:23.206121 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03745531_bf65_4651_8ce0_de448fb99038.slice/crio-4e124be2aca8b2ca6ce39e2d7244ec4885a5c1a02a9116350b06737962af79f2 WatchSource:0}: Error finding container 4e124be2aca8b2ca6ce39e2d7244ec4885a5c1a02a9116350b06737962af79f2: Status 404 returned error can't find the container with id 4e124be2aca8b2ca6ce39e2d7244ec4885a5c1a02a9116350b06737962af79f2 Dec 07 19:35:23 crc kubenswrapper[4815]: W1207 19:35:23.213895 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a26306_1a35_4196_8538_3361e51808fc.slice/crio-cbb4adc244f0712e70a47111c5232a2a22053934a2b6e214972cc9e0d637d7fc WatchSource:0}: Error finding container cbb4adc244f0712e70a47111c5232a2a22053934a2b6e214972cc9e0d637d7fc: Status 404 returned error can't find the container with id cbb4adc244f0712e70a47111c5232a2a22053934a2b6e214972cc9e0d637d7fc Dec 07 19:35:23 crc kubenswrapper[4815]: I1207 19:35:23.216365 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-867657f79f-kqjvk"] Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 
19:35:24.085544 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-867657f79f-kqjvk" event={"ID":"22a26306-1a35-4196-8538-3361e51808fc","Type":"ContainerStarted","Data":"cbb4adc244f0712e70a47111c5232a2a22053934a2b6e214972cc9e0d637d7fc"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.091981 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" event={"ID":"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8","Type":"ContainerStarted","Data":"6b8e981c6a3bd4ff12cbe774c0b5aa72ba8e04a5e6b7e7a1d4d5bb36b9f7b1af"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.163443 4815 generic.go:334] "Generic (PLEG): container finished" podID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerID="d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68" exitCode=0 Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.163525 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-crppt" event={"ID":"890f70c1-67cc-4e1f-883f-3468d96efc77","Type":"ContainerDied","Data":"d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.174338 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerStarted","Data":"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.174390 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerStarted","Data":"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.174399 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" 
event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerStarted","Data":"4e124be2aca8b2ca6ce39e2d7244ec4885a5c1a02a9116350b06737962af79f2"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.174970 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.175421 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.218900 4815 generic.go:334] "Generic (PLEG): container finished" podID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerID="d8db10fcc665455ea782f9324b354461ff056f610d893659790bb546c69044ea" exitCode=0 Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.219438 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerDied","Data":"d8db10fcc665455ea782f9324b354461ff056f610d893659790bb546c69044ea"} Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.225741 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podStartSLOduration=2.225724112 podStartE2EDuration="2.225724112s" podCreationTimestamp="2025-12-07 19:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:24.211169451 +0000 UTC m=+1228.790159496" watchObservedRunningTime="2025-12-07 19:35:24.225724112 +0000 UTC m=+1228.804714157" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.252152 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393314 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393355 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q89mk\" (UniqueName: \"kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393456 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393487 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393548 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393576 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.393596 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data\") pod \"69daea8f-61a8-4a91-b782-afdcb01e0605\" (UID: \"69daea8f-61a8-4a91-b782-afdcb01e0605\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.396904 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.401052 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk" (OuterVolumeSpecName: "kube-api-access-q89mk") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "kube-api-access-q89mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.403417 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.405547 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts" (OuterVolumeSpecName: "scripts") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.435256 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.512231 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.512256 4815 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.512266 4815 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69daea8f-61a8-4a91-b782-afdcb01e0605-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.512277 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q89mk\" (UniqueName: \"kubernetes.io/projected/69daea8f-61a8-4a91-b782-afdcb01e0605-kube-api-access-q89mk\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc 
kubenswrapper[4815]: I1207 19:35:24.512290 4815 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.582127 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data" (OuterVolumeSpecName: "config-data") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.593610 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69daea8f-61a8-4a91-b782-afdcb01e0605" (UID: "69daea8f-61a8-4a91-b782-afdcb01e0605"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.687273 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.687300 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69daea8f-61a8-4a91-b782-afdcb01e0605-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.821120 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.996509 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.996557 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.996605 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.996672 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.996754 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.997493 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdzfx\" (UniqueName: \"kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.997531 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data\") pod \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\" (UID: \"cfb75e0b-4e7b-484f-832b-5ed69650f1f1\") " Dec 07 19:35:24 crc kubenswrapper[4815]: I1207 19:35:24.997869 4815 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.001338 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.001364 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts" (OuterVolumeSpecName: "scripts") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.002076 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx" (OuterVolumeSpecName: "kube-api-access-vdzfx") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "kube-api-access-vdzfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.034670 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.069382 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data" (OuterVolumeSpecName: "config-data") pod "cfb75e0b-4e7b-484f-832b-5ed69650f1f1" (UID: "cfb75e0b-4e7b-484f-832b-5ed69650f1f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.099668 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdzfx\" (UniqueName: \"kubernetes.io/projected/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-kube-api-access-vdzfx\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.100871 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.100996 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.101066 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.101131 4815 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cfb75e0b-4e7b-484f-832b-5ed69650f1f1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.328358 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ljqjq" event={"ID":"cfb75e0b-4e7b-484f-832b-5ed69650f1f1","Type":"ContainerDied","Data":"4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b"} Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.328444 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a9d5b25d318176c42c38f15e65ae67741be53499e8415195dc8a81c2c15b71b" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.328537 4815 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ljqjq" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.346526 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-crppt" event={"ID":"890f70c1-67cc-4e1f-883f-3468d96efc77","Type":"ContainerStarted","Data":"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec"} Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.346623 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.363622 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.363873 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69daea8f-61a8-4a91-b782-afdcb01e0605","Type":"ContainerDied","Data":"37fbd0932e634d778b202d7543ffc23cb36b827899ffe2760a118b98c551d0a7"} Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.364416 4815 scope.go:117] "RemoveContainer" containerID="0706952bb5f3c765ed6f14f7964fd27ddc584a9a073d3c8f103b0ccf8f4b49b5" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.418282 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-869f779d85-crppt" podStartSLOduration=3.418259963 podStartE2EDuration="3.418259963s" podCreationTimestamp="2025-12-07 19:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:25.402394916 +0000 UTC m=+1229.981384961" watchObservedRunningTime="2025-12-07 19:35:25.418259963 +0000 UTC m=+1229.997250008" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.512208 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: 
I1207 19:35:25.539171 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.571965 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: E1207 19:35:25.572410 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-central-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572431 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-central-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: E1207 19:35:25.572470 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" containerName="cinder-db-sync" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572481 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" containerName="cinder-db-sync" Dec 07 19:35:25 crc kubenswrapper[4815]: E1207 19:35:25.572490 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-notification-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572496 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-notification-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: E1207 19:35:25.572506 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="proxy-httpd" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572512 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="proxy-httpd" Dec 07 19:35:25 crc kubenswrapper[4815]: E1207 19:35:25.572529 4815 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="sg-core" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572535 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="sg-core" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572692 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" containerName="cinder-db-sync" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572705 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-notification-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572712 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="sg-core" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572720 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="ceilometer-central-agent" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.572733 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" containerName="proxy-httpd" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.573857 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.579498 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.579555 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.583550 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.585351 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m5wjk" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.609967 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.640039 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.642478 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.655254 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.655452 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.679102 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693155 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693194 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v96\" (UniqueName: \"kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693248 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693272 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693309 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.693332 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.708389 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802513 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802630 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802674 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46v96\" (UniqueName: 
\"kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802724 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802761 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm5wj\" (UniqueName: \"kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802851 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802873 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.802926 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " 
pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.803193 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.803362 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.803484 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.803654 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.803740 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.804038 4815 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.874422 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.876396 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.876859 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.877285 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.877768 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69daea8f-61a8-4a91-b782-afdcb01e0605" path="/var/lib/kubelet/pods/69daea8f-61a8-4a91-b782-afdcb01e0605/volumes" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.895622 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v96\" (UniqueName: \"kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96\") pod \"cinder-scheduler-0\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.902162 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912182 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912242 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912284 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912369 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912397 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lm5wj\" (UniqueName: \"kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912477 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.912529 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.924545 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.925979 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.926771 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.946005 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.930904 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.949577 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.953442 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.955605 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 
19:35:25.966238 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.966970 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6cfdf5698d-xlqfg"] Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.969096 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.983303 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 07 19:35:25 crc kubenswrapper[4815]: I1207 19:35:25.983652 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.010244 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cfdf5698d-xlqfg"] Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014105 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-public-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014171 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eb493fe-dc18-41f8-8102-7d6b906d1a63-logs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: 
I1207 19:35:26.014229 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014298 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4x5r\" (UniqueName: \"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014336 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data-custom\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014356 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-internal-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014375 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " 
pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014422 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpn4\" (UniqueName: \"kubernetes.io/projected/5eb493fe-dc18-41f8-8102-7d6b906d1a63-kube-api-access-xhpn4\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014489 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014512 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-combined-ca-bundle\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014532 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.014571 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: 
\"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.034955 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm5wj\" (UniqueName: \"kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj\") pod \"ceilometer-0\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " pod="openstack/ceilometer-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116112 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpn4\" (UniqueName: \"kubernetes.io/projected/5eb493fe-dc18-41f8-8102-7d6b906d1a63-kube-api-access-xhpn4\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116202 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-combined-ca-bundle\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116227 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116257 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " 
pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116284 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116306 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-public-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eb493fe-dc18-41f8-8102-7d6b906d1a63-logs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116355 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116406 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4x5r\" (UniqueName: \"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc 
kubenswrapper[4815]: I1207 19:35:26.116430 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data-custom\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116446 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-internal-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.116467 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.117552 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.117893 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5eb493fe-dc18-41f8-8102-7d6b906d1a63-logs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.118665 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.119303 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.119571 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.120649 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-combined-ca-bundle\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.123412 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.124752 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-config-data-custom\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.130431 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-internal-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.140248 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eb493fe-dc18-41f8-8102-7d6b906d1a63-public-tls-certs\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.153365 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpn4\" (UniqueName: \"kubernetes.io/projected/5eb493fe-dc18-41f8-8102-7d6b906d1a63-kube-api-access-xhpn4\") pod \"barbican-api-6cfdf5698d-xlqfg\" (UID: \"5eb493fe-dc18-41f8-8102-7d6b906d1a63\") " pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.159312 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4x5r\" (UniqueName: \"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r\") pod \"dnsmasq-dns-58db5546cc-hwggf\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.177075 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.178558 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.182515 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.303237 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.304815 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.305001 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.305370 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.402952 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403014 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403039 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs\") pod \"cinder-api-0\" (UID: 
\"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403105 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403123 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403158 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdfgh\" (UniqueName: \"kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.403186 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512656 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 
19:35:26.512709 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512747 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512812 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512829 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512845 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdfgh\" (UniqueName: \"kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.512874 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.513645 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.514053 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.520836 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.522167 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.523709 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.530466 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.530930 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdfgh\" (UniqueName: \"kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh\") pod \"cinder-api-0\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.605713 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.713235 4815 scope.go:117] "RemoveContainer" containerID="03f5e66f8f929a2eb3d5b4a807b2b2019938d3990b67f87bb25a0df5ca30144c" Dec 07 19:35:26 crc kubenswrapper[4815]: I1207 19:35:26.893162 4815 scope.go:117] "RemoveContainer" containerID="d8db10fcc665455ea782f9324b354461ff056f610d893659790bb546c69044ea" Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.007806 4815 scope.go:117] "RemoveContainer" containerID="6cfc9f445c2c6b5c1aaf487517b32ed8cdf2aa13f5153d0592c6d4402a336c89" Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.436834 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" event={"ID":"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8","Type":"ContainerStarted","Data":"cf36c3cf29ceb9f32e7c46376ed0c5113868437315ca061870c2daa5ed5f0ac3"} Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.447763 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-869f779d85-crppt" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="dnsmasq-dns" containerID="cri-o://397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec" gracePeriod=10 Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.448076 
4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-867657f79f-kqjvk" event={"ID":"22a26306-1a35-4196-8538-3361e51808fc","Type":"ContainerStarted","Data":"1c1d4a6707c54110d2a4bbd149829636b26d8f2498d0c50ae1cc399692fa5386"} Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.716062 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.730477 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:27 crc kubenswrapper[4815]: W1207 19:35:27.752133 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fee8d41_443f_470b_8d24_8f203edd74f9.slice/crio-d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e WatchSource:0}: Error finding container d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e: Status 404 returned error can't find the container with id d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.752782 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:27 crc kubenswrapper[4815]: W1207 19:35:27.793104 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca97776a_2f5b_4188_9c96_ff0ee5a94002.slice/crio-be30b35974c34fa57c8cd427b2d64a3031ea003b040eb2bee8ca16fc1c691641 WatchSource:0}: Error finding container be30b35974c34fa57c8cd427b2d64a3031ea003b040eb2bee8ca16fc1c691641: Status 404 returned error can't find the container with id be30b35974c34fa57c8cd427b2d64a3031ea003b040eb2bee8ca16fc1c691641 Dec 07 19:35:27 crc kubenswrapper[4815]: I1207 19:35:27.810222 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6cfdf5698d-xlqfg"] Dec 07 19:35:27 crc 
kubenswrapper[4815]: I1207 19:35:27.941886 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.082805 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.155109 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config\") pod \"890f70c1-67cc-4e1f-883f-3468d96efc77\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.155215 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrnmh\" (UniqueName: \"kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh\") pod \"890f70c1-67cc-4e1f-883f-3468d96efc77\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.155306 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb\") pod \"890f70c1-67cc-4e1f-883f-3468d96efc77\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.155354 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb\") pod \"890f70c1-67cc-4e1f-883f-3468d96efc77\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.155378 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc\") pod 
\"890f70c1-67cc-4e1f-883f-3468d96efc77\" (UID: \"890f70c1-67cc-4e1f-883f-3468d96efc77\") " Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.169021 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh" (OuterVolumeSpecName: "kube-api-access-rrnmh") pod "890f70c1-67cc-4e1f-883f-3468d96efc77" (UID: "890f70c1-67cc-4e1f-883f-3468d96efc77"). InnerVolumeSpecName "kube-api-access-rrnmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.230659 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "890f70c1-67cc-4e1f-883f-3468d96efc77" (UID: "890f70c1-67cc-4e1f-883f-3468d96efc77"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.237039 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "890f70c1-67cc-4e1f-883f-3468d96efc77" (UID: "890f70c1-67cc-4e1f-883f-3468d96efc77"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.246322 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "890f70c1-67cc-4e1f-883f-3468d96efc77" (UID: "890f70c1-67cc-4e1f-883f-3468d96efc77"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.257618 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.257650 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.257660 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.257671 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrnmh\" (UniqueName: \"kubernetes.io/projected/890f70c1-67cc-4e1f-883f-3468d96efc77-kube-api-access-rrnmh\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.261770 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config" (OuterVolumeSpecName: "config") pod "890f70c1-67cc-4e1f-883f-3468d96efc77" (UID: "890f70c1-67cc-4e1f-883f-3468d96efc77"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.358700 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/890f70c1-67cc-4e1f-883f-3468d96efc77-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.472308 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-867657f79f-kqjvk" event={"ID":"22a26306-1a35-4196-8538-3361e51808fc","Type":"ContainerStarted","Data":"153e096598a34aa69257981dbf524166c0ab7056c63fe6d552ab5fbc136dcfef"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.474666 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cfdf5698d-xlqfg" event={"ID":"5eb493fe-dc18-41f8-8102-7d6b906d1a63","Type":"ContainerStarted","Data":"874d41044fd572d16c6f8fcfcec16112ae80a7bed202596cc939a2f683a42104"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.474704 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cfdf5698d-xlqfg" event={"ID":"5eb493fe-dc18-41f8-8102-7d6b906d1a63","Type":"ContainerStarted","Data":"5d85c2b1ca8052b92bccb88d6c1f53a93434471ef5388ff029b16aac36833ebb"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.474715 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cfdf5698d-xlqfg" event={"ID":"5eb493fe-dc18-41f8-8102-7d6b906d1a63","Type":"ContainerStarted","Data":"409dc668ba6aa05c8dd851b8beafa98c566a656ec2559ea60a6a47d2f7bdb91e"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.474810 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.474835 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.479366 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" event={"ID":"7b554cad-0000-4ecb-97df-7f0fbdb8c7e8","Type":"ContainerStarted","Data":"a9c83849b5db067996d9d10a2c57187fb695cf93a346c19598380629e9cd493a"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.481166 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerStarted","Data":"ca8f473a521531654fab49f33e9d141574f2cf7005c6606ac68925e49614f8e1"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.493263 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-867657f79f-kqjvk" podStartSLOduration=3.744027372 podStartE2EDuration="7.493246108s" podCreationTimestamp="2025-12-07 19:35:21 +0000 UTC" firstStartedPulling="2025-12-07 19:35:23.218078841 +0000 UTC m=+1227.797068886" lastFinishedPulling="2025-12-07 19:35:26.967297577 +0000 UTC m=+1231.546287622" observedRunningTime="2025-12-07 19:35:28.492737084 +0000 UTC m=+1233.071727129" watchObservedRunningTime="2025-12-07 19:35:28.493246108 +0000 UTC m=+1233.072236153" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.493903 4815 generic.go:334] "Generic (PLEG): container finished" podID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerID="397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec" exitCode=0 Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.494009 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-crppt" event={"ID":"890f70c1-67cc-4e1f-883f-3468d96efc77","Type":"ContainerDied","Data":"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.494046 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-869f779d85-crppt" 
event={"ID":"890f70c1-67cc-4e1f-883f-3468d96efc77","Type":"ContainerDied","Data":"67f5ea7334f56eceae22c95a0065bb2fe809856bfa80d4d2d01d05b13a6096f0"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.493984 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869f779d85-crppt" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.494068 4815 scope.go:117] "RemoveContainer" containerID="397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.513654 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerStarted","Data":"d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.532267 4815 generic.go:334] "Generic (PLEG): container finished" podID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerID="d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f" exitCode=0 Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.532645 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" event={"ID":"ca97776a-2f5b-4188-9c96-ff0ee5a94002","Type":"ContainerDied","Data":"d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.532673 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" event={"ID":"ca97776a-2f5b-4188-9c96-ff0ee5a94002","Type":"ContainerStarted","Data":"be30b35974c34fa57c8cd427b2d64a3031ea003b040eb2bee8ca16fc1c691641"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.533830 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6cfdf5698d-xlqfg" podStartSLOduration=3.533810153 podStartE2EDuration="3.533810153s" podCreationTimestamp="2025-12-07 19:35:25 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:28.531387055 +0000 UTC m=+1233.110377100" watchObservedRunningTime="2025-12-07 19:35:28.533810153 +0000 UTC m=+1233.112800198" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.543110 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerStarted","Data":"c240d9f7e31d799e089a6148381fe8455fe8aa2707de858295af5fb4dee28565"} Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.597560 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-bc648dd8-m5s6x" podStartSLOduration=3.732801005 podStartE2EDuration="7.597539452s" podCreationTimestamp="2025-12-07 19:35:21 +0000 UTC" firstStartedPulling="2025-12-07 19:35:23.096612242 +0000 UTC m=+1227.675602277" lastFinishedPulling="2025-12-07 19:35:26.961350669 +0000 UTC m=+1231.540340724" observedRunningTime="2025-12-07 19:35:28.568880573 +0000 UTC m=+1233.147870618" watchObservedRunningTime="2025-12-07 19:35:28.597539452 +0000 UTC m=+1233.176529497" Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.721318 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.726093 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869f779d85-crppt"] Dec 07 19:35:28 crc kubenswrapper[4815]: I1207 19:35:28.770040 4815 scope.go:117] "RemoveContainer" containerID="d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.129701 4815 scope.go:117] "RemoveContainer" containerID="397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec" Dec 07 19:35:29 crc kubenswrapper[4815]: E1207 19:35:29.135053 4815 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec\": container with ID starting with 397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec not found: ID does not exist" containerID="397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.135107 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec"} err="failed to get container status \"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec\": rpc error: code = NotFound desc = could not find container \"397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec\": container with ID starting with 397bd82632eef4997cdc00a7016e94aa8edc1b8c875cfa3d52ac98a62c900cec not found: ID does not exist" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.135138 4815 scope.go:117] "RemoveContainer" containerID="d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68" Dec 07 19:35:29 crc kubenswrapper[4815]: E1207 19:35:29.135490 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68\": container with ID starting with d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68 not found: ID does not exist" containerID="d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.135535 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68"} err="failed to get container status \"d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68\": rpc error: code = NotFound desc = could not find container 
\"d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68\": container with ID starting with d792290c9f092857a09c62f3ff8f8dc19178a3467ae494c022580db194ea6a68 not found: ID does not exist" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.575318 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerStarted","Data":"77e9cd6e5aa474c3b93dce8f1044055eedb05d3144f0668a29681ad252a757bd"} Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.587525 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" event={"ID":"ca97776a-2f5b-4188-9c96-ff0ee5a94002","Type":"ContainerStarted","Data":"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d"} Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.588258 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.591601 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerStarted","Data":"e298f74a21b244c75c8fb6f5c735b8de7527b7362c9969e50ebf1236c8e1e037"} Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.614283 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" podStartSLOduration=4.614268761 podStartE2EDuration="4.614268761s" podCreationTimestamp="2025-12-07 19:35:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:29.609827695 +0000 UTC m=+1234.188817760" watchObservedRunningTime="2025-12-07 19:35:29.614268761 +0000 UTC m=+1234.193258806" Dec 07 19:35:29 crc kubenswrapper[4815]: I1207 19:35:29.815202 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" path="/var/lib/kubelet/pods/890f70c1-67cc-4e1f-883f-3468d96efc77/volumes" Dec 07 19:35:30 crc kubenswrapper[4815]: I1207 19:35:30.168778 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:30 crc kubenswrapper[4815]: I1207 19:35:30.634855 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerStarted","Data":"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d"} Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.080230 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.645058 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerStarted","Data":"8e79d983bc26c3afdd6f97bef7ed4cb4ff56dc471e407cc1606beba10a7dbac5"} Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.645441 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api-log" containerID="cri-o://77e9cd6e5aa474c3b93dce8f1044055eedb05d3144f0668a29681ad252a757bd" gracePeriod=30 Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.645576 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api" containerID="cri-o://8e79d983bc26c3afdd6f97bef7ed4cb4ff56dc471e407cc1606beba10a7dbac5" gracePeriod=30 Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.645643 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.652851 4815 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerStarted","Data":"0b9247d293527d45f23d929de7857537057e57de5415b603aaf0653a50d00168"} Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.655661 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerStarted","Data":"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc"} Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.673394 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.673374431 podStartE2EDuration="5.673374431s" podCreationTimestamp="2025-12-07 19:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:31.664569653 +0000 UTC m=+1236.243559698" watchObservedRunningTime="2025-12-07 19:35:31.673374431 +0000 UTC m=+1236.252364476" Dec 07 19:35:31 crc kubenswrapper[4815]: I1207 19:35:31.703115 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.641268419 podStartE2EDuration="6.70309794s" podCreationTimestamp="2025-12-07 19:35:25 +0000 UTC" firstStartedPulling="2025-12-07 19:35:27.842237413 +0000 UTC m=+1232.421227458" lastFinishedPulling="2025-12-07 19:35:28.904066934 +0000 UTC m=+1233.483056979" observedRunningTime="2025-12-07 19:35:31.696331119 +0000 UTC m=+1236.275321164" watchObservedRunningTime="2025-12-07 19:35:31.70309794 +0000 UTC m=+1236.282087985" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.697819 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerStarted","Data":"de12606e922eed5e3216cbd98f35ac9f51c4271f8248bff6d4c6e51d8e266e67"} Dec 07 19:35:32 crc 
kubenswrapper[4815]: I1207 19:35:32.699961 4815 generic.go:334] "Generic (PLEG): container finished" podID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerID="8e79d983bc26c3afdd6f97bef7ed4cb4ff56dc471e407cc1606beba10a7dbac5" exitCode=0 Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.699991 4815 generic.go:334] "Generic (PLEG): container finished" podID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerID="77e9cd6e5aa474c3b93dce8f1044055eedb05d3144f0668a29681ad252a757bd" exitCode=143 Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.700031 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerDied","Data":"8e79d983bc26c3afdd6f97bef7ed4cb4ff56dc471e407cc1606beba10a7dbac5"} Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.700064 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerDied","Data":"77e9cd6e5aa474c3b93dce8f1044055eedb05d3144f0668a29681ad252a757bd"} Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.700079 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2fee8d41-443f-470b-8d24-8f203edd74f9","Type":"ContainerDied","Data":"d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e"} Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.700088 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88f84ecf85c079ffd2ebf02eece8e6e2b4ebfa9b5e5281eb360424570c8856e" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.715277 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781130 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781185 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdfgh\" (UniqueName: \"kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781213 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781251 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781294 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781324 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.781345 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id\") pod \"2fee8d41-443f-470b-8d24-8f203edd74f9\" (UID: \"2fee8d41-443f-470b-8d24-8f203edd74f9\") " Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.782593 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs" (OuterVolumeSpecName: "logs") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.783475 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.791477 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.796425 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh" (OuterVolumeSpecName: "kube-api-access-tdfgh") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "kube-api-access-tdfgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.812317 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts" (OuterVolumeSpecName: "scripts") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.884511 4815 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2fee8d41-443f-470b-8d24-8f203edd74f9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.884597 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2fee8d41-443f-470b-8d24-8f203edd74f9-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.884650 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdfgh\" (UniqueName: \"kubernetes.io/projected/2fee8d41-443f-470b-8d24-8f203edd74f9-kube-api-access-tdfgh\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.884662 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:32 crc 
kubenswrapper[4815]: I1207 19:35:32.884671 4815 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.915767 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:32 crc kubenswrapper[4815]: I1207 19:35:32.985758 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.019161 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data" (OuterVolumeSpecName: "config-data") pod "2fee8d41-443f-470b-8d24-8f203edd74f9" (UID: "2fee8d41-443f-470b-8d24-8f203edd74f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.090019 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fee8d41-443f-470b-8d24-8f203edd74f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.714839 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerStarted","Data":"c6b7bf7f663c29627ec6376c4bd7a71b49b335071e518b1c3bed920cfda6acc3"} Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.715208 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.714949 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.736198 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.772425479 podStartE2EDuration="8.736078864s" podCreationTimestamp="2025-12-07 19:35:25 +0000 UTC" firstStartedPulling="2025-12-07 19:35:28.025076634 +0000 UTC m=+1232.604066679" lastFinishedPulling="2025-12-07 19:35:32.988730019 +0000 UTC m=+1237.567720064" observedRunningTime="2025-12-07 19:35:33.734041306 +0000 UTC m=+1238.313031351" watchObservedRunningTime="2025-12-07 19:35:33.736078864 +0000 UTC m=+1238.315068909" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.763534 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.783430 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.790442 4815 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.790774 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.795625 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:33 crc kubenswrapper[4815]: E1207 19:35:33.796020 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="init" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796042 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="init" Dec 07 19:35:33 crc kubenswrapper[4815]: E1207 19:35:33.796059 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="dnsmasq-dns" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796066 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="dnsmasq-dns" Dec 07 19:35:33 crc kubenswrapper[4815]: E1207 19:35:33.796091 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796096 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api" Dec 07 19:35:33 crc 
kubenswrapper[4815]: E1207 19:35:33.796109 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api-log" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796114 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api-log" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796320 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="890f70c1-67cc-4e1f-883f-3468d96efc77" containerName="dnsmasq-dns" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796337 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api-log" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.796348 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" containerName="cinder-api" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.797212 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.802980 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.807047 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.807234 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.812235 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.978037 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b698eff3-3885-47f0-bdf5-3de49fa89141-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.978443 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.979080 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b698eff3-3885-47f0-bdf5-3de49fa89141-logs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.979109 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-scripts\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.979187 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.979243 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqb9\" (UniqueName: \"kubernetes.io/projected/b698eff3-3885-47f0-bdf5-3de49fa89141-kube-api-access-lhqb9\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.979269 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data-custom\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.980349 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:33 crc kubenswrapper[4815]: I1207 19:35:33.980381 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-internal-tls-certs\") pod 
\"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081696 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081750 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b698eff3-3885-47f0-bdf5-3de49fa89141-logs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081768 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-scripts\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081785 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081806 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqb9\" (UniqueName: \"kubernetes.io/projected/b698eff3-3885-47f0-bdf5-3de49fa89141-kube-api-access-lhqb9\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081820 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data-custom\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081906 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081947 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.081985 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b698eff3-3885-47f0-bdf5-3de49fa89141-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.082051 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b698eff3-3885-47f0-bdf5-3de49fa89141-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.086299 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b698eff3-3885-47f0-bdf5-3de49fa89141-logs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc 
kubenswrapper[4815]: I1207 19:35:34.089140 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.089992 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-scripts\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.093049 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.095523 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-public-tls-certs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.101387 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.105133 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b698eff3-3885-47f0-bdf5-3de49fa89141-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.105630 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqb9\" (UniqueName: \"kubernetes.io/projected/b698eff3-3885-47f0-bdf5-3de49fa89141-kube-api-access-lhqb9\") pod \"cinder-api-0\" (UID: \"b698eff3-3885-47f0-bdf5-3de49fa89141\") " pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.117153 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.846462 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 07 19:35:34 crc kubenswrapper[4815]: I1207 19:35:34.948695 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-869d694d95-htxnx" Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.057763 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.058009 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66fb95d9b4-wdncv" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-api" containerID="cri-o://c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9" gracePeriod=30 Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.058456 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66fb95d9b4-wdncv" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-httpd" containerID="cri-o://9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510" gracePeriod=30 Dec 07 19:35:35 crc kubenswrapper[4815]: E1207 19:35:35.347872 4815 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod092eb215_4eaa_4402_9692_1cf4aa9928e3.slice/crio-9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510.scope\": RecentStats: unable to find data in memory cache]" Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.729831 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b698eff3-3885-47f0-bdf5-3de49fa89141","Type":"ContainerStarted","Data":"338c05d54130414fe760255275f1ad7060eedcb18818e0e1e8f374ad25b70cad"} Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.734828 4815 generic.go:334] "Generic (PLEG): container finished" podID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerID="9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510" exitCode=0 Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.734873 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerDied","Data":"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510"} Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.786389 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fee8d41-443f-470b-8d24-8f203edd74f9" path="/var/lib/kubelet/pods/2fee8d41-443f-470b-8d24-8f203edd74f9/volumes" Dec 07 19:35:35 crc kubenswrapper[4815]: I1207 19:35:35.911123 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.306063 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.404818 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.416180 4815 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="dnsmasq-dns" containerID="cri-o://59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444" gracePeriod=10 Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.532207 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.800696 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b698eff3-3885-47f0-bdf5-3de49fa89141","Type":"ContainerStarted","Data":"b660ebcc08780e48c2e74e669fc127a4490e03ad0fd47bfbb5fdc729b6d7a406"} Dec 07 19:35:36 crc kubenswrapper[4815]: I1207 19:35:36.918079 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.292440 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.428301 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config\") pod \"d3d79808-d535-4247-b6d3-6152049a185e\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.428380 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb\") pod \"d3d79808-d535-4247-b6d3-6152049a185e\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.428538 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb\") pod \"d3d79808-d535-4247-b6d3-6152049a185e\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.428567 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc\") pod \"d3d79808-d535-4247-b6d3-6152049a185e\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.428621 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjrh7\" (UniqueName: \"kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7\") pod \"d3d79808-d535-4247-b6d3-6152049a185e\" (UID: \"d3d79808-d535-4247-b6d3-6152049a185e\") " Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.465279 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7" (OuterVolumeSpecName: "kube-api-access-bjrh7") pod "d3d79808-d535-4247-b6d3-6152049a185e" (UID: "d3d79808-d535-4247-b6d3-6152049a185e"). InnerVolumeSpecName "kube-api-access-bjrh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.529519 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3d79808-d535-4247-b6d3-6152049a185e" (UID: "d3d79808-d535-4247-b6d3-6152049a185e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.532293 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.532331 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjrh7\" (UniqueName: \"kubernetes.io/projected/d3d79808-d535-4247-b6d3-6152049a185e-kube-api-access-bjrh7\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.554595 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3d79808-d535-4247-b6d3-6152049a185e" (UID: "d3d79808-d535-4247-b6d3-6152049a185e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.557795 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3d79808-d535-4247-b6d3-6152049a185e" (UID: "d3d79808-d535-4247-b6d3-6152049a185e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.580834 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config" (OuterVolumeSpecName: "config") pod "d3d79808-d535-4247-b6d3-6152049a185e" (UID: "d3d79808-d535-4247-b6d3-6152049a185e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.633571 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.633604 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.633613 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3d79808-d535-4247-b6d3-6152049a185e-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.783177 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.783183 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.813961 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b698eff3-3885-47f0-bdf5-3de49fa89141","Type":"ContainerStarted","Data":"9e5bfc23a07abe6b5aae764825a8a50632710e8785dc6da44bae308b2cca9cd0"} Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.815250 4815 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.831078 4815 generic.go:334] "Generic (PLEG): container finished" podID="d3d79808-d535-4247-b6d3-6152049a185e" containerID="59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444" exitCode=0 Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.831514 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="cinder-scheduler" containerID="cri-o://bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d" gracePeriod=30 Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.831853 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.832228 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerDied","Data":"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444"} Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.832285 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f66db59b9-qc6xg" event={"ID":"d3d79808-d535-4247-b6d3-6152049a185e","Type":"ContainerDied","Data":"10a39f47c10b2d312cf53aaec25cd38e4d05b84434262001d2240700c0c27579"} Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.832306 4815 scope.go:117] "RemoveContainer" containerID="59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.832462 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="probe" 
containerID="cri-o://699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc" gracePeriod=30 Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.850441 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.872936 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.87290681 podStartE2EDuration="4.87290681s" podCreationTimestamp="2025-12-07 19:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:37.844472447 +0000 UTC m=+1242.423462492" watchObservedRunningTime="2025-12-07 19:35:37.87290681 +0000 UTC m=+1242.451896855" Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.884493 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.904967 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f66db59b9-qc6xg"] Dec 07 19:35:37 crc kubenswrapper[4815]: I1207 19:35:37.926025 4815 scope.go:117] "RemoveContainer" containerID="7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942" Dec 07 19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.007103 4815 scope.go:117] "RemoveContainer" containerID="59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444" Dec 07 19:35:38 crc kubenswrapper[4815]: E1207 19:35:38.009318 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444\": container with ID starting with 59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444 not found: ID does not exist" containerID="59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444" Dec 07 
19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.009361 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444"} err="failed to get container status \"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444\": rpc error: code = NotFound desc = could not find container \"59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444\": container with ID starting with 59dda7884a0a9118128cc92109b6420908376852f9733dc400881d66091bc444 not found: ID does not exist" Dec 07 19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.009387 4815 scope.go:117] "RemoveContainer" containerID="7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942" Dec 07 19:35:38 crc kubenswrapper[4815]: E1207 19:35:38.011239 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942\": container with ID starting with 7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942 not found: ID does not exist" containerID="7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942" Dec 07 19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.011275 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942"} err="failed to get container status \"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942\": rpc error: code = NotFound desc = could not find container \"7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942\": container with ID starting with 7b128be25a3c3150ddb35f02c1e40a485f80f6f81d6d3d10828698a48ada7942 not found: ID does not exist" Dec 07 19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.843011 4815 generic.go:334] "Generic (PLEG): container finished" podID="861aea1c-8b59-408a-a5a5-8eafebb607f1" 
containerID="bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d" exitCode=0 Dec 07 19:35:38 crc kubenswrapper[4815]: I1207 19:35:38.843233 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerDied","Data":"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d"} Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.372968 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386324 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v96\" (UniqueName: \"kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386438 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386478 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386541 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc 
kubenswrapper[4815]: I1207 19:35:39.386562 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386585 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386622 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle\") pod \"861aea1c-8b59-408a-a5a5-8eafebb607f1\" (UID: \"861aea1c-8b59-408a-a5a5-8eafebb607f1\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.386994 4815 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/861aea1c-8b59-408a-a5a5-8eafebb607f1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.449204 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96" (OuterVolumeSpecName: "kube-api-access-46v96") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "kube-api-access-46v96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.449295 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.449380 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts" (OuterVolumeSpecName: "scripts") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.488086 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.501141 4815 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.501191 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.501206 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46v96\" (UniqueName: \"kubernetes.io/projected/861aea1c-8b59-408a-a5a5-8eafebb607f1-kube-api-access-46v96\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.501262 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.555392 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data" (OuterVolumeSpecName: "config-data") pod "861aea1c-8b59-408a-a5a5-8eafebb607f1" (UID: "861aea1c-8b59-408a-a5a5-8eafebb607f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.605555 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/861aea1c-8b59-408a-a5a5-8eafebb607f1-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.772149 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.781542 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d79808-d535-4247-b6d3-6152049a185e" path="/var/lib/kubelet/pods/d3d79808-d535-4247-b6d3-6152049a185e/volumes" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.808934 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config\") pod \"092eb215-4eaa-4402-9692-1cf4aa9928e3\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.808981 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs\") pod \"092eb215-4eaa-4402-9692-1cf4aa9928e3\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.809040 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhc6v\" (UniqueName: \"kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v\") pod \"092eb215-4eaa-4402-9692-1cf4aa9928e3\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.809136 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config\") pod \"092eb215-4eaa-4402-9692-1cf4aa9928e3\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.809282 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle\") pod 
\"092eb215-4eaa-4402-9692-1cf4aa9928e3\" (UID: \"092eb215-4eaa-4402-9692-1cf4aa9928e3\") " Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.834220 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v" (OuterVolumeSpecName: "kube-api-access-hhc6v") pod "092eb215-4eaa-4402-9692-1cf4aa9928e3" (UID: "092eb215-4eaa-4402-9692-1cf4aa9928e3"). InnerVolumeSpecName "kube-api-access-hhc6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.834826 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "092eb215-4eaa-4402-9692-1cf4aa9928e3" (UID: "092eb215-4eaa-4402-9692-1cf4aa9928e3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.869717 4815 generic.go:334] "Generic (PLEG): container finished" podID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerID="699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc" exitCode=0 Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.869806 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerDied","Data":"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc"} Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.869835 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"861aea1c-8b59-408a-a5a5-8eafebb607f1","Type":"ContainerDied","Data":"ca8f473a521531654fab49f33e9d141574f2cf7005c6606ac68925e49614f8e1"} Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.869850 4815 scope.go:117] "RemoveContainer" 
containerID="699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.869972 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.885383 4815 generic.go:334] "Generic (PLEG): container finished" podID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerID="c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9" exitCode=0 Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.886377 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66fb95d9b4-wdncv" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.886575 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerDied","Data":"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9"} Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.886602 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb95d9b4-wdncv" event={"ID":"092eb215-4eaa-4402-9692-1cf4aa9928e3","Type":"ContainerDied","Data":"fd793563c416952c0bde61671b755698f845abca2a3d0032a3e8659a591a6643"} Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.898064 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config" (OuterVolumeSpecName: "config") pod "092eb215-4eaa-4402-9692-1cf4aa9928e3" (UID: "092eb215-4eaa-4402-9692-1cf4aa9928e3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.915736 4815 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.924102 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.924116 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhc6v\" (UniqueName: \"kubernetes.io/projected/092eb215-4eaa-4402-9692-1cf4aa9928e3-kube-api-access-hhc6v\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.944740 4815 scope.go:117] "RemoveContainer" containerID="bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.969120 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092eb215-4eaa-4402-9692-1cf4aa9928e3" (UID: "092eb215-4eaa-4402-9692-1cf4aa9928e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:39 crc kubenswrapper[4815]: I1207 19:35:39.971987 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.002560 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.034137 4815 scope.go:117] "RemoveContainer" containerID="699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.036627 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc\": container with ID starting with 699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc not found: ID does not exist" containerID="699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.036673 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc"} err="failed to get container status \"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc\": rpc error: code = NotFound desc = could not find container \"699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc\": container with ID starting with 699f68b317975d4a25de3d6499567eded70aa41b28003550bbcae5c604915bbc not found: ID does not exist" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.036769 4815 scope.go:117] "RemoveContainer" containerID="bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.050700 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.053084 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d\": container with ID starting with bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d not found: ID does not exist" containerID="bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.053391 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d"} err="failed to get container status \"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d\": rpc error: code = NotFound desc = could not find container \"bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d\": container with ID starting with bb353d0635c5d5823833b5cb186d50accecefb96aef371d140c49506c817063d not found: ID does not exist" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.053419 4815 scope.go:117] "RemoveContainer" containerID="9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.054122 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056094 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="dnsmasq-dns" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056115 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="dnsmasq-dns" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056145 4815 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="probe" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056151 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="probe" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056183 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-api" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056189 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-api" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056208 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-httpd" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056214 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-httpd" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056236 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="cinder-scheduler" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056242 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="cinder-scheduler" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.056259 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="init" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056265 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="init" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056585 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" 
containerName="neutron-api" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056618 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="probe" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056630 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" containerName="cinder-scheduler" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056651 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" containerName="neutron-httpd" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.056670 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d79808-d535-4247-b6d3-6152049a185e" containerName="dnsmasq-dns" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.067319 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.069028 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.072896 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.111731 4815 scope.go:117] "RemoveContainer" containerID="c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.153981 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.154045 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.154428 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.154620 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztxdq\" (UniqueName: \"kubernetes.io/projected/82a1e678-f13e-4971-86e1-9e53a78da17a-kube-api-access-ztxdq\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.154717 4815 scope.go:117] "RemoveContainer" containerID="9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.154953 
4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-scripts\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.155085 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82a1e678-f13e-4971-86e1-9e53a78da17a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.155485 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.155639 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510\": container with ID starting with 9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510 not found: ID does not exist" containerID="9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.155691 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510"} err="failed to get container status \"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510\": rpc error: code = NotFound desc = could not find container 
\"9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510\": container with ID starting with 9a646cf1a20f69e63d455c594a9224f63eb6d8c2c3aff2790476f333402e5510 not found: ID does not exist" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.155730 4815 scope.go:117] "RemoveContainer" containerID="c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9" Dec 07 19:35:40 crc kubenswrapper[4815]: E1207 19:35:40.156517 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9\": container with ID starting with c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9 not found: ID does not exist" containerID="c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.156625 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9"} err="failed to get container status \"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9\": rpc error: code = NotFound desc = could not find container \"c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9\": container with ID starting with c162cbf7bc1bea033b438651ed33a918c8b3eed3ab44859ca45384f607ec6be9 not found: ID does not exist" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.157051 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "092eb215-4eaa-4402-9692-1cf4aa9928e3" (UID: "092eb215-4eaa-4402-9692-1cf4aa9928e3"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.256837 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztxdq\" (UniqueName: \"kubernetes.io/projected/82a1e678-f13e-4971-86e1-9e53a78da17a-kube-api-access-ztxdq\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257125 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-scripts\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257235 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82a1e678-f13e-4971-86e1-9e53a78da17a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257394 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257509 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257603 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257754 4815 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/092eb215-4eaa-4402-9692-1cf4aa9928e3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.257632 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/82a1e678-f13e-4971-86e1-9e53a78da17a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.260394 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-scripts\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.261342 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.268203 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc 
kubenswrapper[4815]: I1207 19:35:40.269756 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a1e678-f13e-4971-86e1-9e53a78da17a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.269866 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.293468 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66fb95d9b4-wdncv"] Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.294574 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztxdq\" (UniqueName: \"kubernetes.io/projected/82a1e678-f13e-4971-86e1-9e53a78da17a-kube-api-access-ztxdq\") pod \"cinder-scheduler-0\" (UID: \"82a1e678-f13e-4971-86e1-9e53a78da17a\") " pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.352183 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6cfdf5698d-xlqfg" podUID="5eb493fe-dc18-41f8-8102-7d6b906d1a63" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.412584 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 07 19:35:40 crc kubenswrapper[4815]: I1207 19:35:40.932299 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.309142 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cfdf5698d-xlqfg" podUID="5eb493fe-dc18-41f8-8102-7d6b906d1a63" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.150:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.465524 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8bbdc95bd-g568f" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.720818 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.812818 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092eb215-4eaa-4402-9692-1cf4aa9928e3" path="/var/lib/kubelet/pods/092eb215-4eaa-4402-9692-1cf4aa9928e3/volumes" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.815051 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861aea1c-8b59-408a-a5a5-8eafebb607f1" path="/var/lib/kubelet/pods/861aea1c-8b59-408a-a5a5-8eafebb607f1/volumes" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.903525 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8bbdc95bd-g568f" Dec 07 19:35:41 crc kubenswrapper[4815]: I1207 19:35:41.930009 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82a1e678-f13e-4971-86e1-9e53a78da17a","Type":"ContainerStarted","Data":"2b42497866e1cc3bee852e24f3289bf128ed91482ce8046a970ae6293a06c075"} Dec 07 19:35:41 crc kubenswrapper[4815]: 
I1207 19:35:41.930408 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82a1e678-f13e-4971-86e1-9e53a78da17a","Type":"ContainerStarted","Data":"25ab767411d77034307aa5b647cf957f465ee3d6306f83402c4005688a160ebd"} Dec 07 19:35:42 crc kubenswrapper[4815]: I1207 19:35:42.938514 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"82a1e678-f13e-4971-86e1-9e53a78da17a","Type":"ContainerStarted","Data":"0fb0c106a39af098f7b566614757d26ae64e3ab987dbf04eecca4937e5c96203"} Dec 07 19:35:42 crc kubenswrapper[4815]: I1207 19:35:42.962596 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.962578722 podStartE2EDuration="3.962578722s" podCreationTimestamp="2025-12-07 19:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:35:42.959011872 +0000 UTC m=+1247.538001917" watchObservedRunningTime="2025-12-07 19:35:42.962578722 +0000 UTC m=+1247.541568767" Dec 07 19:35:44 crc kubenswrapper[4815]: I1207 19:35:44.455419 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-687459c5bc-4dzjp" Dec 07 19:35:44 crc kubenswrapper[4815]: I1207 19:35:44.865381 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6cfdf5698d-xlqfg" Dec 07 19:35:44 crc kubenswrapper[4815]: I1207 19:35:44.916709 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:44 crc kubenswrapper[4815]: I1207 19:35:44.916943 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" 
containerID="cri-o://3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6" gracePeriod=30 Dec 07 19:35:44 crc kubenswrapper[4815]: I1207 19:35:44.917074 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" containerID="cri-o://3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1" gracePeriod=30 Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.412892 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.551004 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.552354 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.563544 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.563666 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-c5sws" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.564364 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.576346 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.678337 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " 
pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.678466 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.678496 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.678535 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wxj9\" (UniqueName: \"kubernetes.io/projected/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-kube-api-access-9wxj9\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.736400 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: E1207 19:35:45.737021 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-9wxj9 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="f89ccff5-752c-4e06-ba7f-dea8d1e8d502" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.743250 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.758962 4815 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.759979 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.805723 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89ccff5-752c-4e06-ba7f-dea8d1e8d502" path="/var/lib/kubelet/pods/f89ccff5-752c-4e06-ba7f-dea8d1e8d502/volumes" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.806343 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.815607 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.815661 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.815724 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wxj9\" (UniqueName: \"kubernetes.io/projected/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-kube-api-access-9wxj9\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.815780 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: E1207 19:35:45.819334 4815 projected.go:194] Error preparing data for projected volume kube-api-access-9wxj9 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f89ccff5-752c-4e06-ba7f-dea8d1e8d502) does not match the UID in record. The object might have been deleted and then recreated Dec 07 19:35:45 crc kubenswrapper[4815]: E1207 19:35:45.825795 4815 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-kube-api-access-9wxj9 podName:f89ccff5-752c-4e06-ba7f-dea8d1e8d502 nodeName:}" failed. No retries permitted until 2025-12-07 19:35:46.325764759 +0000 UTC m=+1250.904754804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9wxj9" (UniqueName: "kubernetes.io/projected/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-kube-api-access-9wxj9") pod "openstackclient" (UID: "f89ccff5-752c-4e06-ba7f-dea8d1e8d502") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (f89ccff5-752c-4e06-ba7f-dea8d1e8d502) does not match the UID in record. 
The object might have been deleted and then recreated Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.821515 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.831424 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.854622 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret\") pod \"openstackclient\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.919479 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.919561 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.919697 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68bvv\" (UniqueName: \"kubernetes.io/projected/d115a489-cf3a-4e3c-af23-fbe3875a65f2-kube-api-access-68bvv\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.919899 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.964215 4815 generic.go:334] "Generic (PLEG): container finished" podID="03745531-bf65-4651-8ce0-de448fb99038" containerID="3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6" exitCode=143 Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.964503 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.965126 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerDied","Data":"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6"} Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.973998 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:45 crc kubenswrapper[4815]: I1207 19:35:45.976867 4815 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f89ccff5-752c-4e06-ba7f-dea8d1e8d502" podUID="d115a489-cf3a-4e3c-af23-fbe3875a65f2" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.021718 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.022079 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68bvv\" (UniqueName: \"kubernetes.io/projected/d115a489-cf3a-4e3c-af23-fbe3875a65f2-kube-api-access-68bvv\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.022161 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.022273 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.023222 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.027606 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-openstack-config-secret\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.037934 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d115a489-cf3a-4e3c-af23-fbe3875a65f2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.041546 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68bvv\" (UniqueName: \"kubernetes.io/projected/d115a489-cf3a-4e3c-af23-fbe3875a65f2-kube-api-access-68bvv\") pod \"openstackclient\" (UID: \"d115a489-cf3a-4e3c-af23-fbe3875a65f2\") " pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.108672 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.125535 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle\") pod \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.125693 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret\") pod \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.125792 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config\") pod \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\" (UID: \"f89ccff5-752c-4e06-ba7f-dea8d1e8d502\") " Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.126332 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wxj9\" (UniqueName: \"kubernetes.io/projected/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-kube-api-access-9wxj9\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.126511 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f89ccff5-752c-4e06-ba7f-dea8d1e8d502" (UID: "f89ccff5-752c-4e06-ba7f-dea8d1e8d502"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.132179 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f89ccff5-752c-4e06-ba7f-dea8d1e8d502" (UID: "f89ccff5-752c-4e06-ba7f-dea8d1e8d502"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.132266 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f89ccff5-752c-4e06-ba7f-dea8d1e8d502" (UID: "f89ccff5-752c-4e06-ba7f-dea8d1e8d502"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.227621 4815 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.227653 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.227669 4815 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f89ccff5-752c-4e06-ba7f-dea8d1e8d502-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.617756 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 
19:35:46.979284 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d115a489-cf3a-4e3c-af23-fbe3875a65f2","Type":"ContainerStarted","Data":"48950c6faef7b901d69154b0e20917acdbe01c31c84be68fe53317e0342e518c"} Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.979715 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 07 19:35:46 crc kubenswrapper[4815]: I1207 19:35:46.994055 4815 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f89ccff5-752c-4e06-ba7f-dea8d1e8d502" podUID="d115a489-cf3a-4e3c-af23-fbe3875a65f2" Dec 07 19:35:47 crc kubenswrapper[4815]: I1207 19:35:47.337971 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 07 19:35:47 crc kubenswrapper[4815]: I1207 19:35:47.783197 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f89ccff5-752c-4e06-ba7f-dea8d1e8d502" path="/var/lib/kubelet/pods/f89ccff5-752c-4e06-ba7f-dea8d1e8d502/volumes" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.122816 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:55718->10.217.0.146:9311: read: connection reset by peer" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.124574 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.146:9311/healthcheck\": read tcp 10.217.0.2:55720->10.217.0.146:9311: read: connection reset by peer" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.623725 4815 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.764601 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs\") pod \"03745531-bf65-4651-8ce0-de448fb99038\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.764713 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data\") pod \"03745531-bf65-4651-8ce0-de448fb99038\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.765283 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs" (OuterVolumeSpecName: "logs") pod "03745531-bf65-4651-8ce0-de448fb99038" (UID: "03745531-bf65-4651-8ce0-de448fb99038"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.765500 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom\") pod \"03745531-bf65-4651-8ce0-de448fb99038\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.765524 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5m6q\" (UniqueName: \"kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q\") pod \"03745531-bf65-4651-8ce0-de448fb99038\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.765692 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle\") pod \"03745531-bf65-4651-8ce0-de448fb99038\" (UID: \"03745531-bf65-4651-8ce0-de448fb99038\") " Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.766006 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03745531-bf65-4651-8ce0-de448fb99038-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.770490 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03745531-bf65-4651-8ce0-de448fb99038" (UID: "03745531-bf65-4651-8ce0-de448fb99038"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.775089 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q" (OuterVolumeSpecName: "kube-api-access-r5m6q") pod "03745531-bf65-4651-8ce0-de448fb99038" (UID: "03745531-bf65-4651-8ce0-de448fb99038"). InnerVolumeSpecName "kube-api-access-r5m6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.818118 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03745531-bf65-4651-8ce0-de448fb99038" (UID: "03745531-bf65-4651-8ce0-de448fb99038"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.836133 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data" (OuterVolumeSpecName: "config-data") pod "03745531-bf65-4651-8ce0-de448fb99038" (UID: "03745531-bf65-4651-8ce0-de448fb99038"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.867883 4815 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.867923 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5m6q\" (UniqueName: \"kubernetes.io/projected/03745531-bf65-4651-8ce0-de448fb99038-kube-api-access-r5m6q\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.867936 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:48 crc kubenswrapper[4815]: I1207 19:35:48.867944 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03745531-bf65-4651-8ce0-de448fb99038-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.007404 4815 generic.go:334] "Generic (PLEG): container finished" podID="03745531-bf65-4651-8ce0-de448fb99038" containerID="3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1" exitCode=0 Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.007451 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerDied","Data":"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1"} Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.007496 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" 
event={"ID":"03745531-bf65-4651-8ce0-de448fb99038","Type":"ContainerDied","Data":"4e124be2aca8b2ca6ce39e2d7244ec4885a5c1a02a9116350b06737962af79f2"} Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.007519 4815 scope.go:117] "RemoveContainer" containerID="3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.010552 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6cb9b7fdf8-65cp6" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.053242 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.058095 4815 scope.go:117] "RemoveContainer" containerID="3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.062794 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6cb9b7fdf8-65cp6"] Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.081646 4815 scope.go:117] "RemoveContainer" containerID="3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1" Dec 07 19:35:49 crc kubenswrapper[4815]: E1207 19:35:49.082313 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1\": container with ID starting with 3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1 not found: ID does not exist" containerID="3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.082445 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1"} err="failed to get container status 
\"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1\": rpc error: code = NotFound desc = could not find container \"3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1\": container with ID starting with 3993ff65020a7db120f04627f871934000c74a34730cccb6fd8b45cab1a85bc1 not found: ID does not exist" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.082557 4815 scope.go:117] "RemoveContainer" containerID="3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6" Dec 07 19:35:49 crc kubenswrapper[4815]: E1207 19:35:49.083172 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6\": container with ID starting with 3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6 not found: ID does not exist" containerID="3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.083300 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6"} err="failed to get container status \"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6\": rpc error: code = NotFound desc = could not find container \"3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6\": container with ID starting with 3e667753123e28d9fb46a094341f67ee9ff2fed6bf88d502bbc32bb56b5964a6 not found: ID does not exist" Dec 07 19:35:49 crc kubenswrapper[4815]: I1207 19:35:49.810284 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03745531-bf65-4651-8ce0-de448fb99038" path="/var/lib/kubelet/pods/03745531-bf65-4651-8ce0-de448fb99038/volumes" Dec 07 19:35:50 crc kubenswrapper[4815]: I1207 19:35:50.785388 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 07 
19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.565516 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.566626 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-central-agent" containerID="cri-o://e298f74a21b244c75c8fb6f5c735b8de7527b7362c9969e50ebf1236c8e1e037" gracePeriod=30 Dec 07 19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.566779 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" containerID="cri-o://c6b7bf7f663c29627ec6376c4bd7a71b49b335071e518b1c3bed920cfda6acc3" gracePeriod=30 Dec 07 19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.566817 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="sg-core" containerID="cri-o://de12606e922eed5e3216cbd98f35ac9f51c4271f8248bff6d4c6e51d8e266e67" gracePeriod=30 Dec 07 19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.566847 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-notification-agent" containerID="cri-o://0b9247d293527d45f23d929de7857537057e57de5415b603aaf0653a50d00168" gracePeriod=30 Dec 07 19:35:53 crc kubenswrapper[4815]: I1207 19:35:53.581396 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 07 19:35:54 crc kubenswrapper[4815]: I1207 19:35:54.055474 4815 generic.go:334] "Generic (PLEG): container finished" podID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" 
containerID="c6b7bf7f663c29627ec6376c4bd7a71b49b335071e518b1c3bed920cfda6acc3" exitCode=0 Dec 07 19:35:54 crc kubenswrapper[4815]: I1207 19:35:54.055506 4815 generic.go:334] "Generic (PLEG): container finished" podID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerID="de12606e922eed5e3216cbd98f35ac9f51c4271f8248bff6d4c6e51d8e266e67" exitCode=2 Dec 07 19:35:54 crc kubenswrapper[4815]: I1207 19:35:54.055526 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerDied","Data":"c6b7bf7f663c29627ec6376c4bd7a71b49b335071e518b1c3bed920cfda6acc3"} Dec 07 19:35:54 crc kubenswrapper[4815]: I1207 19:35:54.055550 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerDied","Data":"de12606e922eed5e3216cbd98f35ac9f51c4271f8248bff6d4c6e51d8e266e67"} Dec 07 19:35:56 crc kubenswrapper[4815]: I1207 19:35:55.069838 4815 generic.go:334] "Generic (PLEG): container finished" podID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerID="e298f74a21b244c75c8fb6f5c735b8de7527b7362c9969e50ebf1236c8e1e037" exitCode=0 Dec 07 19:35:56 crc kubenswrapper[4815]: I1207 19:35:55.070007 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerDied","Data":"e298f74a21b244c75c8fb6f5c735b8de7527b7362c9969e50ebf1236c8e1e037"} Dec 07 19:35:56 crc kubenswrapper[4815]: I1207 19:35:56.308707 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.148:3000/\": dial tcp 10.217.0.148:3000: connect: connection refused" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.152710 4815 generic.go:334] "Generic (PLEG): container finished" 
podID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerID="0b9247d293527d45f23d929de7857537057e57de5415b603aaf0653a50d00168" exitCode=0 Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.152755 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerDied","Data":"0b9247d293527d45f23d929de7857537057e57de5415b603aaf0653a50d00168"} Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.254426 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mxr8g"] Dec 07 19:35:57 crc kubenswrapper[4815]: E1207 19:35:57.254755 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.254772 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" Dec 07 19:35:57 crc kubenswrapper[4815]: E1207 19:35:57.254811 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.254818 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.254982 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api-log" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.255002 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="03745531-bf65-4651-8ce0-de448fb99038" containerName="barbican-api" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.255544 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.271194 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mxr8g"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.354167 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9jkcq"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.416731 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.416778 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpxz\" (UniqueName: \"kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.417781 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.441387 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9jkcq"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.454389 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ba1d-account-create-update-kt8v7"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.456067 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.458334 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.477194 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ba1d-account-create-update-kt8v7"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.518866 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kslhq\" (UniqueName: \"kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq\") pod \"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.519151 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts\") pod \"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.519218 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.519274 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpxz\" (UniqueName: \"kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " 
pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.520524 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.548341 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpxz\" (UniqueName: \"kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz\") pod \"nova-api-db-create-mxr8g\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.557974 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gmwt8"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.559608 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.570671 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gmwt8"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.571240 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.621138 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts\") pod \"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.621501 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.621730 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kslhq\" (UniqueName: \"kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq\") pod \"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.622193 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgp87\" (UniqueName: \"kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.622445 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts\") pod 
\"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.640593 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kslhq\" (UniqueName: \"kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq\") pod \"nova-cell0-db-create-9jkcq\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.673517 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d187-account-create-update-db6lq"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.674636 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.676505 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.685726 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d187-account-create-update-db6lq"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.724882 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.724952 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts\") pod \"nova-cell1-db-create-gmwt8\" (UID: 
\"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.724982 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99p56\" (UniqueName: \"kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56\") pod \"nova-cell1-db-create-gmwt8\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.725026 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgp87\" (UniqueName: \"kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.725604 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.745534 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgp87\" (UniqueName: \"kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87\") pod \"nova-api-ba1d-account-create-update-kt8v7\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.767835 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.781292 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.829253 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.829330 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts\") pod \"nova-cell1-db-create-gmwt8\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.829360 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99p56\" (UniqueName: \"kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56\") pod \"nova-cell1-db-create-gmwt8\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.829382 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtll6\" (UniqueName: \"kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 
19:35:57.830136 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts\") pod \"nova-cell1-db-create-gmwt8\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.937604 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtll6\" (UniqueName: \"kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.937740 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.938788 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.939439 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99p56\" (UniqueName: \"kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56\") pod \"nova-cell1-db-create-gmwt8\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " pod="openstack/nova-cell1-db-create-gmwt8" Dec 
07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.962992 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-5cd5-account-create-update-tdn22"] Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.964101 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.967600 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.978452 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtll6\" (UniqueName: \"kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6\") pod \"nova-cell0-d187-account-create-update-db6lq\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:57 crc kubenswrapper[4815]: I1207 19:35:57.997151 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5cd5-account-create-update-tdn22"] Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.002459 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.140435 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.140551 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lvss\" (UniqueName: \"kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.209020 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.242053 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lvss\" (UniqueName: \"kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.242210 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.242892 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.262938 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lvss\" (UniqueName: \"kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss\") pod \"nova-cell1-5cd5-account-create-update-tdn22\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:35:58 crc kubenswrapper[4815]: I1207 19:35:58.338551 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.414814 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.508150 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.508424 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.508454 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.508478 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.508499 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm5wj\" (UniqueName: \"kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: 
\"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.510541 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.511126 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.513625 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj" (OuterVolumeSpecName: "kube-api-access-lm5wj") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "kube-api-access-lm5wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.541966 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9jkcq"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.562276 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.619701 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.619836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts\") pod \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\" (UID: \"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1\") " Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.620328 4815 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.620345 4815 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.620358 4815 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.620370 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm5wj\" (UniqueName: \"kubernetes.io/projected/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-kube-api-access-lm5wj\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.624284 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts" (OuterVolumeSpecName: "scripts") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.632022 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.727325 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.730298 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.740872 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mxr8g"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.754275 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ba1d-account-create-update-kt8v7"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.847498 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data" (OuterVolumeSpecName: "config-data") pod "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" (UID: "392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.862046 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gmwt8"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.921590 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d187-account-create-update-db6lq"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.930610 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-5cd5-account-create-update-tdn22"] Dec 07 19:36:01 crc kubenswrapper[4815]: I1207 19:36:01.934119 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:01 crc kubenswrapper[4815]: W1207 19:36:01.978716 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcca9cdb9_858e_463a_b89f_4fe86331b304.slice/crio-ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5 WatchSource:0}: Error finding container ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5: Status 404 returned error can't find the container with id ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5 Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.231431 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9jkcq" event={"ID":"d740fe41-3740-496c-b34d-3fd2f63fb619","Type":"ContainerStarted","Data":"63c108d380c4d742b2fa6c28ab59f9759c819309bd5dc1bcb83bb82eaf53d029"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.231482 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9jkcq" event={"ID":"d740fe41-3740-496c-b34d-3fd2f63fb619","Type":"ContainerStarted","Data":"7b74433f1477265988ef2877029d9a9619be4f4679367b0dbfffc981489d302e"} 
Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.234618 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mxr8g" event={"ID":"87680dbf-76ea-41e7-9349-4a83b07b7c8f","Type":"ContainerStarted","Data":"31df9400bc214ed9b14b1b8916413c0977d9624efba7003e4d8c173732d571c0"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.234651 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mxr8g" event={"ID":"87680dbf-76ea-41e7-9349-4a83b07b7c8f","Type":"ContainerStarted","Data":"c333c8be53d52fe9f1fd3b0063826ef57c4505bbbcdd43ad466c19797c55ff87"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.239276 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1","Type":"ContainerDied","Data":"c240d9f7e31d799e089a6148381fe8455fe8aa2707de858295af5fb4dee28565"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.239318 4815 scope.go:117] "RemoveContainer" containerID="c6b7bf7f663c29627ec6376c4bd7a71b49b335071e518b1c3bed920cfda6acc3" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.239319 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.255875 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d115a489-cf3a-4e3c-af23-fbe3875a65f2","Type":"ContainerStarted","Data":"b19363b29d6e67487d08278a0f812d1d357b061c64a3aff63463d45a186f58f6"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.260036 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-9jkcq" podStartSLOduration=5.260023416 podStartE2EDuration="5.260023416s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.257174035 +0000 UTC m=+1266.836164070" watchObservedRunningTime="2025-12-07 19:36:02.260023416 +0000 UTC m=+1266.839013461" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.264314 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d187-account-create-update-db6lq" event={"ID":"c4283dd6-964a-4f3d-8509-7a2c7fbd393f","Type":"ContainerStarted","Data":"0cec2bb0329f76f7f7a557247163d68d41b2eed2fffcd968cf72da9ca9e73cba"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.264361 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d187-account-create-update-db6lq" event={"ID":"c4283dd6-964a-4f3d-8509-7a2c7fbd393f","Type":"ContainerStarted","Data":"1c952e4669032d960120fe0ee39b774b8a361da093926fcba32c66e1cf6ef070"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.288017 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" event={"ID":"54cb9bbd-aa49-4113-8094-db794e456915","Type":"ContainerStarted","Data":"055213ce89c021f00d002f19dbb91660ad69d0f48893edf9abb3298d5ef36bc6"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.288067 4815 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" event={"ID":"54cb9bbd-aa49-4113-8094-db794e456915","Type":"ContainerStarted","Data":"90d776b6a61b3128bf7045d0ee3f11da61e30244e0463d690bad10872409b6e9"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.289535 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.994910337 podStartE2EDuration="17.289525239s" podCreationTimestamp="2025-12-07 19:35:45 +0000 UTC" firstStartedPulling="2025-12-07 19:35:46.632175201 +0000 UTC m=+1251.211165246" lastFinishedPulling="2025-12-07 19:36:00.926790103 +0000 UTC m=+1265.505780148" observedRunningTime="2025-12-07 19:36:02.280250737 +0000 UTC m=+1266.859240782" watchObservedRunningTime="2025-12-07 19:36:02.289525239 +0000 UTC m=+1266.868515274" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.291054 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gmwt8" event={"ID":"e656eb7e-25aa-4982-a61a-d5a41cfef90d","Type":"ContainerStarted","Data":"7bcc6874aa3d429722f2cc094154f26214b1835ffe964a06f0c7f27672616f32"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.291085 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gmwt8" event={"ID":"e656eb7e-25aa-4982-a61a-d5a41cfef90d","Type":"ContainerStarted","Data":"d307783bc70029009e8f880fab65bf371a16aeb597f92d99314060345fa2eb2a"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.296866 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" event={"ID":"cca9cdb9-858e-463a-b89f-4fe86331b304","Type":"ContainerStarted","Data":"2d596b7f5dc3f88d2fb84b5c96a9b7351ba0f39a7c0fe80afa378773bdfeef7f"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.296899 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" 
event={"ID":"cca9cdb9-858e-463a-b89f-4fe86331b304","Type":"ContainerStarted","Data":"ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5"} Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.302353 4815 scope.go:117] "RemoveContainer" containerID="de12606e922eed5e3216cbd98f35ac9f51c4271f8248bff6d4c6e51d8e266e67" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.330926 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-mxr8g" podStartSLOduration=5.330893636 podStartE2EDuration="5.330893636s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.297801302 +0000 UTC m=+1266.876791347" watchObservedRunningTime="2025-12-07 19:36:02.330893636 +0000 UTC m=+1266.909883681" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.348357 4815 scope.go:117] "RemoveContainer" containerID="0b9247d293527d45f23d929de7857537057e57de5415b603aaf0653a50d00168" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.372854 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" podStartSLOduration=5.37283182 podStartE2EDuration="5.37283182s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.337528404 +0000 UTC m=+1266.916518449" watchObservedRunningTime="2025-12-07 19:36:02.37283182 +0000 UTC m=+1266.951821865" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.412852 4815 scope.go:117] "RemoveContainer" containerID="e298f74a21b244c75c8fb6f5c735b8de7527b7362c9969e50ebf1236c8e1e037" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.416732 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-ba1d-account-create-update-kt8v7" podStartSLOduration=5.416710069 podStartE2EDuration="5.416710069s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.376080052 +0000 UTC m=+1266.955070107" watchObservedRunningTime="2025-12-07 19:36:02.416710069 +0000 UTC m=+1266.995700114" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.437211 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.457698 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.458740 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d187-account-create-update-db6lq" podStartSLOduration=5.458715824 podStartE2EDuration="5.458715824s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.447560279 +0000 UTC m=+1267.026550324" watchObservedRunningTime="2025-12-07 19:36:02.458715824 +0000 UTC m=+1267.037705869" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.520697 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:02 crc kubenswrapper[4815]: E1207 19:36:02.521132 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-notification-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521148 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-notification-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: E1207 19:36:02.521170 4815 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="sg-core" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521177 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="sg-core" Dec 07 19:36:02 crc kubenswrapper[4815]: E1207 19:36:02.521191 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521197 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" Dec 07 19:36:02 crc kubenswrapper[4815]: E1207 19:36:02.521206 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-central-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521214 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-central-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521383 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="proxy-httpd" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521394 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-central-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521412 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="ceilometer-notification-agent" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.521424 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" containerName="sg-core" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.523293 4815 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.523517 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-gmwt8" podStartSLOduration=5.523495423 podStartE2EDuration="5.523495423s" podCreationTimestamp="2025-12-07 19:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:02.464392485 +0000 UTC m=+1267.043382530" watchObservedRunningTime="2025-12-07 19:36:02.523495423 +0000 UTC m=+1267.102485468" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.526103 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.526430 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.563828 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.656616 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657219 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657318 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657348 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cnt\" (UniqueName: \"kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657382 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657413 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.657446 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759084 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759141 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759223 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759257 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cnt\" (UniqueName: \"kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759302 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759342 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 
19:36:02.759376 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.759885 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.760288 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.763797 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.764482 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.772726 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " 
pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.776734 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cnt\" (UniqueName: \"kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.776737 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts\") pod \"ceilometer-0\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " pod="openstack/ceilometer-0" Dec 07 19:36:02 crc kubenswrapper[4815]: I1207 19:36:02.855158 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.316203 4815 generic.go:334] "Generic (PLEG): container finished" podID="cca9cdb9-858e-463a-b89f-4fe86331b304" containerID="2d596b7f5dc3f88d2fb84b5c96a9b7351ba0f39a7c0fe80afa378773bdfeef7f" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.316329 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" event={"ID":"cca9cdb9-858e-463a-b89f-4fe86331b304","Type":"ContainerDied","Data":"2d596b7f5dc3f88d2fb84b5c96a9b7351ba0f39a7c0fe80afa378773bdfeef7f"} Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.318610 4815 generic.go:334] "Generic (PLEG): container finished" podID="d740fe41-3740-496c-b34d-3fd2f63fb619" containerID="63c108d380c4d742b2fa6c28ab59f9759c819309bd5dc1bcb83bb82eaf53d029" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.318659 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9jkcq" 
event={"ID":"d740fe41-3740-496c-b34d-3fd2f63fb619","Type":"ContainerDied","Data":"63c108d380c4d742b2fa6c28ab59f9759c819309bd5dc1bcb83bb82eaf53d029"} Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.326078 4815 generic.go:334] "Generic (PLEG): container finished" podID="87680dbf-76ea-41e7-9349-4a83b07b7c8f" containerID="31df9400bc214ed9b14b1b8916413c0977d9624efba7003e4d8c173732d571c0" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.326217 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mxr8g" event={"ID":"87680dbf-76ea-41e7-9349-4a83b07b7c8f","Type":"ContainerDied","Data":"31df9400bc214ed9b14b1b8916413c0977d9624efba7003e4d8c173732d571c0"} Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.336351 4815 generic.go:334] "Generic (PLEG): container finished" podID="c4283dd6-964a-4f3d-8509-7a2c7fbd393f" containerID="0cec2bb0329f76f7f7a557247163d68d41b2eed2fffcd968cf72da9ca9e73cba" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.336470 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d187-account-create-update-db6lq" event={"ID":"c4283dd6-964a-4f3d-8509-7a2c7fbd393f","Type":"ContainerDied","Data":"0cec2bb0329f76f7f7a557247163d68d41b2eed2fffcd968cf72da9ca9e73cba"} Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.338303 4815 generic.go:334] "Generic (PLEG): container finished" podID="54cb9bbd-aa49-4113-8094-db794e456915" containerID="055213ce89c021f00d002f19dbb91660ad69d0f48893edf9abb3298d5ef36bc6" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.338360 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" event={"ID":"54cb9bbd-aa49-4113-8094-db794e456915","Type":"ContainerDied","Data":"055213ce89c021f00d002f19dbb91660ad69d0f48893edf9abb3298d5ef36bc6"} Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.339477 4815 generic.go:334] "Generic (PLEG): container finished" 
podID="e656eb7e-25aa-4982-a61a-d5a41cfef90d" containerID="7bcc6874aa3d429722f2cc094154f26214b1835ffe964a06f0c7f27672616f32" exitCode=0 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.340403 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gmwt8" event={"ID":"e656eb7e-25aa-4982-a61a-d5a41cfef90d","Type":"ContainerDied","Data":"7bcc6874aa3d429722f2cc094154f26214b1835ffe964a06f0c7f27672616f32"} Dec 07 19:36:03 crc kubenswrapper[4815]: W1207 19:36:03.398901 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84eb52e7_a5e6_4471_b9d6_75a9584c7e6b.slice/crio-aceb690fc359187dc2d5e0bc1979484b32b3c98e0f08e37bfd759603ac0f97b3 WatchSource:0}: Error finding container aceb690fc359187dc2d5e0bc1979484b32b3c98e0f08e37bfd759603ac0f97b3: Status 404 returned error can't find the container with id aceb690fc359187dc2d5e0bc1979484b32b3c98e0f08e37bfd759603ac0f97b3 Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.404895 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:03 crc kubenswrapper[4815]: I1207 19:36:03.782778 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1" path="/var/lib/kubelet/pods/392be0e7-00b4-4e4b-8d1a-93a4cbe2bcc1/volumes" Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.358577 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerStarted","Data":"b25add5ea3c513ebffadb0e1fc7eb0da1a44ba4a7b17d68cce3665a87d90dcce"} Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.359456 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerStarted","Data":"aceb690fc359187dc2d5e0bc1979484b32b3c98e0f08e37bfd759603ac0f97b3"} Dec 07 19:36:04 
crc kubenswrapper[4815]: I1207 19:36:04.735987 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.814168 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jpxz\" (UniqueName: \"kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz\") pod \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.814542 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts\") pod \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\" (UID: \"87680dbf-76ea-41e7-9349-4a83b07b7c8f\") " Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.815590 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87680dbf-76ea-41e7-9349-4a83b07b7c8f" (UID: "87680dbf-76ea-41e7-9349-4a83b07b7c8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.831182 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz" (OuterVolumeSpecName: "kube-api-access-9jpxz") pod "87680dbf-76ea-41e7-9349-4a83b07b7c8f" (UID: "87680dbf-76ea-41e7-9349-4a83b07b7c8f"). InnerVolumeSpecName "kube-api-access-9jpxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.917637 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jpxz\" (UniqueName: \"kubernetes.io/projected/87680dbf-76ea-41e7-9349-4a83b07b7c8f-kube-api-access-9jpxz\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:04 crc kubenswrapper[4815]: I1207 19:36:04.917677 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87680dbf-76ea-41e7-9349-4a83b07b7c8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.126952 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.128589 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.140054 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.148003 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.164210 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.231746 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtll6\" (UniqueName: \"kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6\") pod \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.231879 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99p56\" (UniqueName: \"kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56\") pod \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242047 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgp87\" (UniqueName: \"kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87\") pod \"54cb9bbd-aa49-4113-8094-db794e456915\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242121 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts\") pod \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\" (UID: \"c4283dd6-964a-4f3d-8509-7a2c7fbd393f\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242162 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts\") pod \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\" (UID: \"e656eb7e-25aa-4982-a61a-d5a41cfef90d\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242210 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts\") pod \"54cb9bbd-aa49-4113-8094-db794e456915\" (UID: \"54cb9bbd-aa49-4113-8094-db794e456915\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242314 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts\") pod \"d740fe41-3740-496c-b34d-3fd2f63fb619\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242353 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lvss\" (UniqueName: \"kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss\") pod \"cca9cdb9-858e-463a-b89f-4fe86331b304\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242364 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56" (OuterVolumeSpecName: "kube-api-access-99p56") pod "e656eb7e-25aa-4982-a61a-d5a41cfef90d" (UID: "e656eb7e-25aa-4982-a61a-d5a41cfef90d"). InnerVolumeSpecName "kube-api-access-99p56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242418 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts\") pod \"cca9cdb9-858e-463a-b89f-4fe86331b304\" (UID: \"cca9cdb9-858e-463a-b89f-4fe86331b304\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.242445 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kslhq\" (UniqueName: \"kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq\") pod \"d740fe41-3740-496c-b34d-3fd2f63fb619\" (UID: \"d740fe41-3740-496c-b34d-3fd2f63fb619\") " Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243276 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4283dd6-964a-4f3d-8509-7a2c7fbd393f" (UID: "c4283dd6-964a-4f3d-8509-7a2c7fbd393f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243289 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d740fe41-3740-496c-b34d-3fd2f63fb619" (UID: "d740fe41-3740-496c-b34d-3fd2f63fb619"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243646 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e656eb7e-25aa-4982-a61a-d5a41cfef90d" (UID: "e656eb7e-25aa-4982-a61a-d5a41cfef90d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243895 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99p56\" (UniqueName: \"kubernetes.io/projected/e656eb7e-25aa-4982-a61a-d5a41cfef90d-kube-api-access-99p56\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243932 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.243943 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d740fe41-3740-496c-b34d-3fd2f63fb619-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.244020 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6" (OuterVolumeSpecName: "kube-api-access-rtll6") pod "c4283dd6-964a-4f3d-8509-7a2c7fbd393f" (UID: "c4283dd6-964a-4f3d-8509-7a2c7fbd393f"). InnerVolumeSpecName "kube-api-access-rtll6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.244134 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54cb9bbd-aa49-4113-8094-db794e456915" (UID: "54cb9bbd-aa49-4113-8094-db794e456915"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.244371 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cca9cdb9-858e-463a-b89f-4fe86331b304" (UID: "cca9cdb9-858e-463a-b89f-4fe86331b304"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.249362 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87" (OuterVolumeSpecName: "kube-api-access-sgp87") pod "54cb9bbd-aa49-4113-8094-db794e456915" (UID: "54cb9bbd-aa49-4113-8094-db794e456915"). InnerVolumeSpecName "kube-api-access-sgp87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.249447 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq" (OuterVolumeSpecName: "kube-api-access-kslhq") pod "d740fe41-3740-496c-b34d-3fd2f63fb619" (UID: "d740fe41-3740-496c-b34d-3fd2f63fb619"). InnerVolumeSpecName "kube-api-access-kslhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.259352 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss" (OuterVolumeSpecName: "kube-api-access-9lvss") pod "cca9cdb9-858e-463a-b89f-4fe86331b304" (UID: "cca9cdb9-858e-463a-b89f-4fe86331b304"). InnerVolumeSpecName "kube-api-access-9lvss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345320 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgp87\" (UniqueName: \"kubernetes.io/projected/54cb9bbd-aa49-4113-8094-db794e456915-kube-api-access-sgp87\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345586 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e656eb7e-25aa-4982-a61a-d5a41cfef90d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345595 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54cb9bbd-aa49-4113-8094-db794e456915-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345606 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lvss\" (UniqueName: \"kubernetes.io/projected/cca9cdb9-858e-463a-b89f-4fe86331b304-kube-api-access-9lvss\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345615 4815 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cca9cdb9-858e-463a-b89f-4fe86331b304-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345623 4815 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-kslhq\" (UniqueName: \"kubernetes.io/projected/d740fe41-3740-496c-b34d-3fd2f63fb619-kube-api-access-kslhq\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.345633 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtll6\" (UniqueName: \"kubernetes.io/projected/c4283dd6-964a-4f3d-8509-7a2c7fbd393f-kube-api-access-rtll6\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.375266 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerStarted","Data":"de244cd6e37d338d8d9c2acdf1b6071dd12ee1485c2a12862627a6f29f4dba28"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.376992 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mxr8g" event={"ID":"87680dbf-76ea-41e7-9349-4a83b07b7c8f","Type":"ContainerDied","Data":"c333c8be53d52fe9f1fd3b0063826ef57c4505bbbcdd43ad466c19797c55ff87"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.377030 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c333c8be53d52fe9f1fd3b0063826ef57c4505bbbcdd43ad466c19797c55ff87" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.377107 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mxr8g" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.386787 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d187-account-create-update-db6lq" event={"ID":"c4283dd6-964a-4f3d-8509-7a2c7fbd393f","Type":"ContainerDied","Data":"1c952e4669032d960120fe0ee39b774b8a361da093926fcba32c66e1cf6ef070"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.386823 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c952e4669032d960120fe0ee39b774b8a361da093926fcba32c66e1cf6ef070" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.386888 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d187-account-create-update-db6lq" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.393112 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" event={"ID":"54cb9bbd-aa49-4113-8094-db794e456915","Type":"ContainerDied","Data":"90d776b6a61b3128bf7045d0ee3f11da61e30244e0463d690bad10872409b6e9"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.393158 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d776b6a61b3128bf7045d0ee3f11da61e30244e0463d690bad10872409b6e9" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.393193 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ba1d-account-create-update-kt8v7" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.395167 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-gmwt8" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.395246 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gmwt8" event={"ID":"e656eb7e-25aa-4982-a61a-d5a41cfef90d","Type":"ContainerDied","Data":"d307783bc70029009e8f880fab65bf371a16aeb597f92d99314060345fa2eb2a"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.395339 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d307783bc70029009e8f880fab65bf371a16aeb597f92d99314060345fa2eb2a" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.397465 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" event={"ID":"cca9cdb9-858e-463a-b89f-4fe86331b304","Type":"ContainerDied","Data":"ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.397513 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccfb2e2ce2620b5054ab843ee65944970962647d2cd858208bd6232bc978e8f5" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.397598 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-5cd5-account-create-update-tdn22" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.405911 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9jkcq" event={"ID":"d740fe41-3740-496c-b34d-3fd2f63fb619","Type":"ContainerDied","Data":"7b74433f1477265988ef2877029d9a9619be4f4679367b0dbfffc981489d302e"} Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.405955 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b74433f1477265988ef2877029d9a9619be4f4679367b0dbfffc981489d302e" Dec 07 19:36:05 crc kubenswrapper[4815]: I1207 19:36:05.406002 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9jkcq" Dec 07 19:36:06 crc kubenswrapper[4815]: I1207 19:36:06.416146 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerStarted","Data":"e0d472f2be4f0597306a75cd5d3c713aaffccbd39dff616212df6d3947b789ee"} Dec 07 19:36:07 crc kubenswrapper[4815]: I1207 19:36:07.567718 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerStarted","Data":"b5187464354e586305647640459cdcaa606947c60e97fca6a534379da2586cde"} Dec 07 19:36:07 crc kubenswrapper[4815]: I1207 19:36:07.570201 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:36:07 crc kubenswrapper[4815]: I1207 19:36:07.594097 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.396398668 podStartE2EDuration="5.594082876s" podCreationTimestamp="2025-12-07 19:36:02 +0000 UTC" firstStartedPulling="2025-12-07 19:36:03.401299039 +0000 UTC m=+1267.980289084" lastFinishedPulling="2025-12-07 19:36:06.598983247 +0000 UTC m=+1271.177973292" observedRunningTime="2025-12-07 19:36:07.587841359 +0000 UTC m=+1272.166831404" watchObservedRunningTime="2025-12-07 19:36:07.594082876 +0000 UTC m=+1272.173072911" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.006762 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgp5k"] Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007312 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cb9bbd-aa49-4113-8094-db794e456915" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007373 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cb9bbd-aa49-4113-8094-db794e456915" 
containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007431 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4283dd6-964a-4f3d-8509-7a2c7fbd393f" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007503 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4283dd6-964a-4f3d-8509-7a2c7fbd393f" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007555 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d740fe41-3740-496c-b34d-3fd2f63fb619" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007608 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="d740fe41-3740-496c-b34d-3fd2f63fb619" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007669 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e656eb7e-25aa-4982-a61a-d5a41cfef90d" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007717 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e656eb7e-25aa-4982-a61a-d5a41cfef90d" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007769 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca9cdb9-858e-463a-b89f-4fe86331b304" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007814 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca9cdb9-858e-463a-b89f-4fe86331b304" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: E1207 19:36:08.007867 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87680dbf-76ea-41e7-9349-4a83b07b7c8f" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.007934 
4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="87680dbf-76ea-41e7-9349-4a83b07b7c8f" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008161 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cb9bbd-aa49-4113-8094-db794e456915" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008220 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e656eb7e-25aa-4982-a61a-d5a41cfef90d" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008278 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="d740fe41-3740-496c-b34d-3fd2f63fb619" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008344 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca9cdb9-858e-463a-b89f-4fe86331b304" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008412 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4283dd6-964a-4f3d-8509-7a2c7fbd393f" containerName="mariadb-account-create-update" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.008619 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="87680dbf-76ea-41e7-9349-4a83b07b7c8f" containerName="mariadb-database-create" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.009234 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.012904 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bzmmg" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.013377 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.014315 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.027869 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgp5k"] Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.130415 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.130492 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.130513 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw99p\" (UniqueName: \"kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " 
pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.130555 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.308139 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.308293 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw99p\" (UniqueName: \"kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.308317 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.308408 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " 
pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.315252 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.316604 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.324354 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.363489 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw99p\" (UniqueName: \"kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p\") pod \"nova-cell0-conductor-db-sync-qgp5k\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.631883 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:08 crc kubenswrapper[4815]: I1207 19:36:08.930410 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgp5k"] Dec 07 19:36:09 crc kubenswrapper[4815]: I1207 19:36:09.584482 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" event={"ID":"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5","Type":"ContainerStarted","Data":"4ac1adb440241f680071cd017447d5eecf913e7a2c4a42f9d0fd38a94bb96e65"} Dec 07 19:36:12 crc kubenswrapper[4815]: I1207 19:36:12.669198 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:12 crc kubenswrapper[4815]: I1207 19:36:12.670053 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-central-agent" containerID="cri-o://b25add5ea3c513ebffadb0e1fc7eb0da1a44ba4a7b17d68cce3665a87d90dcce" gracePeriod=30 Dec 07 19:36:12 crc kubenswrapper[4815]: I1207 19:36:12.670173 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="sg-core" containerID="cri-o://e0d472f2be4f0597306a75cd5d3c713aaffccbd39dff616212df6d3947b789ee" gracePeriod=30 Dec 07 19:36:12 crc kubenswrapper[4815]: I1207 19:36:12.670223 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-notification-agent" containerID="cri-o://de244cd6e37d338d8d9c2acdf1b6071dd12ee1485c2a12862627a6f29f4dba28" gracePeriod=30 Dec 07 19:36:12 crc kubenswrapper[4815]: I1207 19:36:12.670227 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" 
containerName="proxy-httpd" containerID="cri-o://b5187464354e586305647640459cdcaa606947c60e97fca6a534379da2586cde" gracePeriod=30 Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638467 4815 generic.go:334] "Generic (PLEG): container finished" podID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerID="b5187464354e586305647640459cdcaa606947c60e97fca6a534379da2586cde" exitCode=0 Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638781 4815 generic.go:334] "Generic (PLEG): container finished" podID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerID="e0d472f2be4f0597306a75cd5d3c713aaffccbd39dff616212df6d3947b789ee" exitCode=2 Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638800 4815 generic.go:334] "Generic (PLEG): container finished" podID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerID="de244cd6e37d338d8d9c2acdf1b6071dd12ee1485c2a12862627a6f29f4dba28" exitCode=0 Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638612 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerDied","Data":"b5187464354e586305647640459cdcaa606947c60e97fca6a534379da2586cde"} Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638839 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerDied","Data":"e0d472f2be4f0597306a75cd5d3c713aaffccbd39dff616212df6d3947b789ee"} Dec 07 19:36:13 crc kubenswrapper[4815]: I1207 19:36:13.638853 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerDied","Data":"de244cd6e37d338d8d9c2acdf1b6071dd12ee1485c2a12862627a6f29f4dba28"} Dec 07 19:36:16 crc kubenswrapper[4815]: I1207 19:36:16.684269 4815 generic.go:334] "Generic (PLEG): container finished" podID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" 
containerID="b25add5ea3c513ebffadb0e1fc7eb0da1a44ba4a7b17d68cce3665a87d90dcce" exitCode=0 Dec 07 19:36:16 crc kubenswrapper[4815]: I1207 19:36:16.684351 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerDied","Data":"b25add5ea3c513ebffadb0e1fc7eb0da1a44ba4a7b17d68cce3665a87d90dcce"} Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.589070 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.715280 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b","Type":"ContainerDied","Data":"aceb690fc359187dc2d5e0bc1979484b32b3c98e0f08e37bfd759603ac0f97b3"} Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.715948 4815 scope.go:117] "RemoveContainer" containerID="b5187464354e586305647640459cdcaa606947c60e97fca6a534379da2586cde" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.715834 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.718726 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.718797 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.718866 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.718974 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.719182 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25cnt\" (UniqueName: \"kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.719226 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.719272 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data\") pod \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\" (UID: \"84eb52e7-a5e6-4471-b9d6-75a9584c7e6b\") " Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.720209 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.720429 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.719265 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" event={"ID":"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5","Type":"ContainerStarted","Data":"ecde84fb50b3ec0ddf493efa89a6e6252d087d8723e4f8cf94993d146ec52765"} Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.724881 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts" (OuterVolumeSpecName: "scripts") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.735880 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt" (OuterVolumeSpecName: "kube-api-access-25cnt") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "kube-api-access-25cnt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.746316 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" podStartSLOduration=2.387309323 podStartE2EDuration="11.74629175s" podCreationTimestamp="2025-12-07 19:36:07 +0000 UTC" firstStartedPulling="2025-12-07 19:36:08.949726791 +0000 UTC m=+1273.528716836" lastFinishedPulling="2025-12-07 19:36:18.308709218 +0000 UTC m=+1282.887699263" observedRunningTime="2025-12-07 19:36:18.73850226 +0000 UTC m=+1283.317492315" watchObservedRunningTime="2025-12-07 19:36:18.74629175 +0000 UTC m=+1283.325281795" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.750379 4815 scope.go:117] "RemoveContainer" containerID="e0d472f2be4f0597306a75cd5d3c713aaffccbd39dff616212df6d3947b789ee" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.766823 4815 scope.go:117] "RemoveContainer" containerID="de244cd6e37d338d8d9c2acdf1b6071dd12ee1485c2a12862627a6f29f4dba28" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.780117 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.785267 4815 scope.go:117] "RemoveContainer" containerID="b25add5ea3c513ebffadb0e1fc7eb0da1a44ba4a7b17d68cce3665a87d90dcce" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.793739 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822217 4815 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822333 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25cnt\" (UniqueName: \"kubernetes.io/projected/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-kube-api-access-25cnt\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822404 4815 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822459 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822513 4815 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-log-httpd\") on node 
\"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.822564 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.844264 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data" (OuterVolumeSpecName: "config-data") pod "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" (UID: "84eb52e7-a5e6-4471-b9d6-75a9584c7e6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:18 crc kubenswrapper[4815]: I1207 19:36:18.923695 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.070492 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.081273 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.101087 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:19 crc kubenswrapper[4815]: E1207 19:36:19.101738 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="sg-core" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.101807 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="sg-core" Dec 07 19:36:19 crc kubenswrapper[4815]: E1207 19:36:19.101926 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" 
containerName="ceilometer-notification-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.101986 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-notification-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: E1207 19:36:19.102069 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-central-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102139 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-central-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: E1207 19:36:19.102217 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="proxy-httpd" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102289 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="proxy-httpd" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102548 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-notification-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102640 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="sg-core" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102722 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="proxy-httpd" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.102803 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" containerName="ceilometer-central-agent" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.104839 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.108064 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.108274 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.117561 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231339 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231664 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnbhh\" (UniqueName: \"kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231736 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231773 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd\") pod \"ceilometer-0\" (UID: 
\"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231793 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231819 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.231855 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333316 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333375 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333402 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333431 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333468 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333492 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.333960 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.334082 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc 
kubenswrapper[4815]: I1207 19:36:19.334249 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnbhh\" (UniqueName: \"kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.339901 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.340740 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.344667 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.350148 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.352593 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnbhh\" (UniqueName: \"kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh\") pod 
\"ceilometer-0\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.437291 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.784353 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84eb52e7-a5e6-4471-b9d6-75a9584c7e6b" path="/var/lib/kubelet/pods/84eb52e7-a5e6-4471-b9d6-75a9584c7e6b/volumes" Dec 07 19:36:19 crc kubenswrapper[4815]: W1207 19:36:19.914785 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54fd738_3402_4969_a7b5_348bb50802cb.slice/crio-2f554b746be7b67ad0df1d46065007514e77a6f700c06730ec854d0baa6eff9c WatchSource:0}: Error finding container 2f554b746be7b67ad0df1d46065007514e77a6f700c06730ec854d0baa6eff9c: Status 404 returned error can't find the container with id 2f554b746be7b67ad0df1d46065007514e77a6f700c06730ec854d0baa6eff9c Dec 07 19:36:19 crc kubenswrapper[4815]: I1207 19:36:19.918549 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:20 crc kubenswrapper[4815]: I1207 19:36:20.747629 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerStarted","Data":"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c"} Dec 07 19:36:20 crc kubenswrapper[4815]: I1207 19:36:20.748105 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerStarted","Data":"2f554b746be7b67ad0df1d46065007514e77a6f700c06730ec854d0baa6eff9c"} Dec 07 19:36:21 crc kubenswrapper[4815]: I1207 19:36:21.759519 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerStarted","Data":"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e"} Dec 07 19:36:22 crc kubenswrapper[4815]: I1207 19:36:22.770100 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerStarted","Data":"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559"} Dec 07 19:36:23 crc kubenswrapper[4815]: I1207 19:36:23.793852 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerStarted","Data":"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b"} Dec 07 19:36:23 crc kubenswrapper[4815]: I1207 19:36:23.795200 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:36:31 crc kubenswrapper[4815]: I1207 19:36:31.899317 4815 generic.go:334] "Generic (PLEG): container finished" podID="83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" containerID="ecde84fb50b3ec0ddf493efa89a6e6252d087d8723e4f8cf94993d146ec52765" exitCode=0 Dec 07 19:36:31 crc kubenswrapper[4815]: I1207 19:36:31.899708 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" event={"ID":"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5","Type":"ContainerDied","Data":"ecde84fb50b3ec0ddf493efa89a6e6252d087d8723e4f8cf94993d146ec52765"} Dec 07 19:36:31 crc kubenswrapper[4815]: I1207 19:36:31.921410 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=9.296296809 podStartE2EDuration="12.921390891s" podCreationTimestamp="2025-12-07 19:36:19 +0000 UTC" firstStartedPulling="2025-12-07 19:36:19.918656571 +0000 UTC m=+1284.497646636" lastFinishedPulling="2025-12-07 19:36:23.543750673 +0000 UTC m=+1288.122740718" observedRunningTime="2025-12-07 19:36:23.820390972 +0000 UTC 
m=+1288.399381017" watchObservedRunningTime="2025-12-07 19:36:31.921390891 +0000 UTC m=+1296.500380936" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.259323 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.485756 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle\") pod \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.485861 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data\") pod \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.486020 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts\") pod \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.486116 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw99p\" (UniqueName: \"kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p\") pod \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\" (UID: \"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5\") " Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.498595 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts" (OuterVolumeSpecName: "scripts") pod "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" (UID: 
"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.498733 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p" (OuterVolumeSpecName: "kube-api-access-pw99p") pod "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" (UID: "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5"). InnerVolumeSpecName "kube-api-access-pw99p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.514185 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" (UID: "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.517060 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data" (OuterVolumeSpecName: "config-data") pod "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" (UID: "83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.588490 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.588527 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.588538 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.588551 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw99p\" (UniqueName: \"kubernetes.io/projected/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5-kube-api-access-pw99p\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.918979 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" event={"ID":"83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5","Type":"ContainerDied","Data":"4ac1adb440241f680071cd017447d5eecf913e7a2c4a42f9d0fd38a94bb96e65"} Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.919032 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ac1adb440241f680071cd017447d5eecf913e7a2c4a42f9d0fd38a94bb96e65" Dec 07 19:36:33 crc kubenswrapper[4815]: I1207 19:36:33.919190 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qgp5k" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.170519 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 07 19:36:34 crc kubenswrapper[4815]: E1207 19:36:34.180166 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" containerName="nova-cell0-conductor-db-sync" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.180202 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" containerName="nova-cell0-conductor-db-sync" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.180410 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" containerName="nova-cell0-conductor-db-sync" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.181051 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.184707 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bzmmg" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.185403 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.212433 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.346548 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 
19:36:34.346620 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88j8q\" (UniqueName: \"kubernetes.io/projected/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-kube-api-access-88j8q\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.346790 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.449889 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.450001 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88j8q\" (UniqueName: \"kubernetes.io/projected/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-kube-api-access-88j8q\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.450132 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.458524 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.458669 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.469333 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88j8q\" (UniqueName: \"kubernetes.io/projected/2efcb819-9e42-4fbb-ad60-8264a02dc9a0-kube-api-access-88j8q\") pod \"nova-cell0-conductor-0\" (UID: \"2efcb819-9e42-4fbb-ad60-8264a02dc9a0\") " pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:34 crc kubenswrapper[4815]: I1207 19:36:34.514070 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:35 crc kubenswrapper[4815]: I1207 19:36:35.026657 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 07 19:36:35 crc kubenswrapper[4815]: I1207 19:36:35.944508 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2efcb819-9e42-4fbb-ad60-8264a02dc9a0","Type":"ContainerStarted","Data":"0ea0ee3b88e35ed8fff1368d8cb38987cacc9146000479099b0189f001827652"} Dec 07 19:36:35 crc kubenswrapper[4815]: I1207 19:36:35.945873 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:35 crc kubenswrapper[4815]: I1207 19:36:35.945983 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2efcb819-9e42-4fbb-ad60-8264a02dc9a0","Type":"ContainerStarted","Data":"f3d1579b450910fba4f6de97429f175339e5b678f7cf14edc492ca3309923502"} Dec 07 19:36:35 crc kubenswrapper[4815]: I1207 19:36:35.978898 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.978881319 podStartE2EDuration="1.978881319s" podCreationTimestamp="2025-12-07 19:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:35.974357741 +0000 UTC m=+1300.553347786" watchObservedRunningTime="2025-12-07 19:36:35.978881319 +0000 UTC m=+1300.557871364" Dec 07 19:36:44 crc kubenswrapper[4815]: I1207 19:36:44.548589 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.097254 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tsf6c"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.098588 4815 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.115488 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tsf6c"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.117957 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.118013 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.221243 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.221879 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.222076 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.222218 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bx5vt\" (UniqueName: \"kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.315866 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.319848 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323125 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5vt\" (UniqueName: \"kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323251 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323303 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323327 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323353 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323381 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjxvb\" (UniqueName: \"kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323406 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.323462 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.324999 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.372538 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5vt\" 
(UniqueName: \"kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.372995 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.373647 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.392936 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tsf6c\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.395143 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.396424 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.400166 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.425534 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.426972 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.427159 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.427246 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.427328 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjxvb\" (UniqueName: \"kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.439374 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.440313 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.441533 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.443437 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.451593 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.481852 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjxvb\" (UniqueName: \"kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb\") pod \"nova-api-0\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.559561 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 
crc kubenswrapper[4815]: I1207 19:36:45.559670 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ff4\" (UniqueName: \"kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.559710 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.594520 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.666031 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.666299 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44ff4\" (UniqueName: \"kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.666322 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " 
pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.673690 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.675268 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.687817 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.700686 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.706822 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.719562 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.728005 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ff4\" (UniqueName: \"kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4\") pod \"nova-scheduler-0\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.771554 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.771608 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2fv\" (UniqueName: \"kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.771631 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.771754 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.824982 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.826237 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.826429 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.870221 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.871384 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.872769 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.889533 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.889880 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890055 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890358 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8zbh\" (UniqueName: 
\"kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890413 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890449 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4xpn\" (UniqueName: \"kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890490 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890550 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890595 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890707 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890756 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2fv\" (UniqueName: \"kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.890798 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.891429 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.911362 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: 
I1207 19:36:45.917277 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.920619 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.926017 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2fv\" (UniqueName: \"kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv\") pod \"nova-metadata-0\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " pod="openstack/nova-metadata-0" Dec 07 19:36:45 crc kubenswrapper[4815]: I1207 19:36:45.949073 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.998941 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.998989 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999073 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999244 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8zbh\" (UniqueName: \"kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999275 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4xpn\" (UniqueName: \"kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999297 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999340 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:45.999365 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.001845 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.002209 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.002768 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.004314 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.009859 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.017371 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.031283 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8zbh\" (UniqueName: \"kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh\") pod \"nova-cell1-novncproxy-0\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.032152 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4xpn\" (UniqueName: \"kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn\") pod \"dnsmasq-dns-8b8cf6657-nlxw2\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.066387 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.192147 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.219219 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tsf6c"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.236933 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.486489 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.501412 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.765971 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nrp9x"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.767774 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.771536 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.772101 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.786259 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nrp9x"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.847180 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.852987 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.853062 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpf6k\" (UniqueName: 
\"kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.853087 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.853118 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.955934 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.955990 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpf6k\" (UniqueName: \"kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.956012 4815 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.956047 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.962984 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.963435 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.967463 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:46 crc kubenswrapper[4815]: I1207 19:36:46.973296 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpf6k\" (UniqueName: 
\"kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k\") pod \"nova-cell1-conductor-db-sync-nrp9x\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.069959 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.080573 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerStarted","Data":"9fc8fb45ddce31a802ca82c0bdea75a713fa298f233b5309319891ad1b3d2c40"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.083033 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46b8c248-9f13-484f-8401-ecb16b5c05fd","Type":"ContainerStarted","Data":"0684989e52d790ebd57d6783bd48a00f7560b28d9e3c95ad12e32b19413e8398"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.090954 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerStarted","Data":"aa8dd24a810cf358e596ffd9e9cd2e7a5a28ee67aa3a392996092df58ecb1b1a"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.092906 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tsf6c" event={"ID":"2cdbde09-9f1a-4448-8fa3-372d29371084","Type":"ContainerStarted","Data":"6467d92ce190ab795185aadce7f6f83f049a27f8b66597f85497c938c9eacee3"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.092989 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tsf6c" event={"ID":"2cdbde09-9f1a-4448-8fa3-372d29371084","Type":"ContainerStarted","Data":"1b326df50b080fdd546b07dd85e1733ec781fd093e3afc4a57180cb6c947dcd0"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.094010 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" event={"ID":"75629150-d865-4487-93f7-5b60e2194ab7","Type":"ContainerStarted","Data":"1dc4c5df84859027041de704c3221a6714807c1b20166873c96e4f9b88ae88a3"} Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.097028 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.150724 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tsf6c" podStartSLOduration=2.150707097 podStartE2EDuration="2.150707097s" podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:47.147821256 +0000 UTC m=+1311.726811301" watchObservedRunningTime="2025-12-07 19:36:47.150707097 +0000 UTC m=+1311.729697132" Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.188302 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:36:47 crc kubenswrapper[4815]: I1207 19:36:47.747896 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nrp9x"] Dec 07 19:36:48 crc kubenswrapper[4815]: I1207 19:36:48.104444 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b346b54-d142-4628-af7d-0479df88e469","Type":"ContainerStarted","Data":"613bacf9e69cbd4ad629e87824664890018715e3d183b14aa9489fa4c52dce67"} Dec 07 19:36:48 crc kubenswrapper[4815]: I1207 19:36:48.110135 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" event={"ID":"e95a85c7-1619-4d79-bc43-3acc00d3ab9a","Type":"ContainerStarted","Data":"a999466062f509667c45e105e9a97387211230533c4ceadb8cd4290c9acdccf5"} Dec 07 19:36:48 crc 
kubenswrapper[4815]: I1207 19:36:48.120932 4815 generic.go:334] "Generic (PLEG): container finished" podID="75629150-d865-4487-93f7-5b60e2194ab7" containerID="407ecc3e063b112e52b2224b78b55d52b4bba65fc1f7704a4eb0e3ebb97da842" exitCode=0 Dec 07 19:36:48 crc kubenswrapper[4815]: I1207 19:36:48.122008 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" event={"ID":"75629150-d865-4487-93f7-5b60e2194ab7","Type":"ContainerDied","Data":"407ecc3e063b112e52b2224b78b55d52b4bba65fc1f7704a4eb0e3ebb97da842"} Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.266350 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" event={"ID":"e95a85c7-1619-4d79-bc43-3acc00d3ab9a","Type":"ContainerStarted","Data":"1c4ec4243730fce8a5a4eb907ccbef3c5cd68c4dff9ca432250d80287da1aad6"} Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.299032 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" event={"ID":"75629150-d865-4487-93f7-5b60e2194ab7","Type":"ContainerStarted","Data":"18ced62d2d9a61bb2e5e92bf9fabb0293ce8263a8c67bd7ac58c5ba5feba5ba2"} Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.300160 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.303531 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" podStartSLOduration=3.303511433 podStartE2EDuration="3.303511433s" podCreationTimestamp="2025-12-07 19:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:49.297055681 +0000 UTC m=+1313.876045736" watchObservedRunningTime="2025-12-07 19:36:49.303511433 +0000 UTC m=+1313.882501478" Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.349387 
4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" podStartSLOduration=4.349369258 podStartE2EDuration="4.349369258s" podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:49.347184026 +0000 UTC m=+1313.926174081" watchObservedRunningTime="2025-12-07 19:36:49.349369258 +0000 UTC m=+1313.928359303" Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.577517 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.657269 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:49 crc kubenswrapper[4815]: I1207 19:36:49.666024 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.332805 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerStarted","Data":"2e68f59d3031524fd452c0742278d3666deb9d8355bf4a49eb552eef5630ae5e"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.333241 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerStarted","Data":"fb6bed024ebfbb77d55ee677f67757b14d7ee99c405d976cc72fbbb84ba8eae6"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.333364 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-log" containerID="cri-o://fb6bed024ebfbb77d55ee677f67757b14d7ee99c405d976cc72fbbb84ba8eae6" gracePeriod=30 Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 
19:36:52.333771 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-metadata" containerID="cri-o://2e68f59d3031524fd452c0742278d3666deb9d8355bf4a49eb552eef5630ae5e" gracePeriod=30 Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.343489 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46b8c248-9f13-484f-8401-ecb16b5c05fd","Type":"ContainerStarted","Data":"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.346031 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerStarted","Data":"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.346074 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerStarted","Data":"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.347150 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b346b54-d142-4628-af7d-0479df88e469","Type":"ContainerStarted","Data":"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41"} Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.347284 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4b346b54-d142-4628-af7d-0479df88e469" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41" gracePeriod=30 Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.383814 4815 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.533108528 podStartE2EDuration="7.383797298s" podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="2025-12-07 19:36:47.207025267 +0000 UTC m=+1311.786015312" lastFinishedPulling="2025-12-07 19:36:51.057714037 +0000 UTC m=+1315.636704082" observedRunningTime="2025-12-07 19:36:52.382057339 +0000 UTC m=+1316.961047384" watchObservedRunningTime="2025-12-07 19:36:52.383797298 +0000 UTC m=+1316.962787343" Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.384584 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.157249598 podStartE2EDuration="7.38457917s" podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="2025-12-07 19:36:46.84152494 +0000 UTC m=+1311.420514985" lastFinishedPulling="2025-12-07 19:36:51.068854512 +0000 UTC m=+1315.647844557" observedRunningTime="2025-12-07 19:36:52.361879379 +0000 UTC m=+1316.940869424" watchObservedRunningTime="2025-12-07 19:36:52.38457917 +0000 UTC m=+1316.963569215" Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.403129 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.846415524 podStartE2EDuration="7.403111413s" podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="2025-12-07 19:36:46.497096238 +0000 UTC m=+1311.076086283" lastFinishedPulling="2025-12-07 19:36:51.053792127 +0000 UTC m=+1315.632782172" observedRunningTime="2025-12-07 19:36:52.400775527 +0000 UTC m=+1316.979765582" watchObservedRunningTime="2025-12-07 19:36:52.403111413 +0000 UTC m=+1316.982101458" Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.423267 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.923658025 podStartE2EDuration="7.423245972s" 
podCreationTimestamp="2025-12-07 19:36:45 +0000 UTC" firstStartedPulling="2025-12-07 19:36:46.558397668 +0000 UTC m=+1311.137387713" lastFinishedPulling="2025-12-07 19:36:51.057985615 +0000 UTC m=+1315.636975660" observedRunningTime="2025-12-07 19:36:52.4178732 +0000 UTC m=+1316.996863245" watchObservedRunningTime="2025-12-07 19:36:52.423245972 +0000 UTC m=+1317.002236027" Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.614436 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:52 crc kubenswrapper[4815]: I1207 19:36:52.614668 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" containerName="kube-state-metrics" containerID="cri-o://2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472" gracePeriod=30 Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.163964 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.298232 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79qvf\" (UniqueName: \"kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf\") pod \"8a52cbc1-9245-48e5-8b22-0cdf96dc671b\" (UID: \"8a52cbc1-9245-48e5-8b22-0cdf96dc671b\") " Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.307144 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf" (OuterVolumeSpecName: "kube-api-access-79qvf") pod "8a52cbc1-9245-48e5-8b22-0cdf96dc671b" (UID: "8a52cbc1-9245-48e5-8b22-0cdf96dc671b"). InnerVolumeSpecName "kube-api-access-79qvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.358861 4815 generic.go:334] "Generic (PLEG): container finished" podID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerID="2e68f59d3031524fd452c0742278d3666deb9d8355bf4a49eb552eef5630ae5e" exitCode=0 Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.359796 4815 generic.go:334] "Generic (PLEG): container finished" podID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerID="fb6bed024ebfbb77d55ee677f67757b14d7ee99c405d976cc72fbbb84ba8eae6" exitCode=143 Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.359760 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerDied","Data":"2e68f59d3031524fd452c0742278d3666deb9d8355bf4a49eb552eef5630ae5e"} Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.359920 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerDied","Data":"fb6bed024ebfbb77d55ee677f67757b14d7ee99c405d976cc72fbbb84ba8eae6"} Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.359950 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a19e8619-f5e9-449d-87c5-ce78d91af395","Type":"ContainerDied","Data":"9fc8fb45ddce31a802ca82c0bdea75a713fa298f233b5309319891ad1b3d2c40"} Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.359962 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fc8fb45ddce31a802ca82c0bdea75a713fa298f233b5309319891ad1b3d2c40" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.362178 4815 generic.go:334] "Generic (PLEG): container finished" podID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" containerID="2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472" exitCode=2 Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 
19:36:53.362315 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a52cbc1-9245-48e5-8b22-0cdf96dc671b","Type":"ContainerDied","Data":"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472"} Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.362348 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8a52cbc1-9245-48e5-8b22-0cdf96dc671b","Type":"ContainerDied","Data":"5a19d08b809ab8d14824d5dfe004c90a1298db2d2ff70003d6ea5533dbd522ed"} Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.362369 4815 scope.go:117] "RemoveContainer" containerID="2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.362522 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.400663 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79qvf\" (UniqueName: \"kubernetes.io/projected/8a52cbc1-9245-48e5-8b22-0cdf96dc671b-kube-api-access-79qvf\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.430794 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.485981 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.496573 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.511980 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:53 crc kubenswrapper[4815]: E1207 19:36:53.512331 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" containerName="kube-state-metrics" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.512345 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" containerName="kube-state-metrics" Dec 07 19:36:53 crc kubenswrapper[4815]: E1207 19:36:53.512354 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-metadata" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.512360 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-metadata" Dec 07 19:36:53 crc kubenswrapper[4815]: E1207 19:36:53.512393 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-log" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.512399 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-log" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.512571 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-metadata" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 
19:36:53.512585 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" containerName="nova-metadata-log" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.512592 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" containerName="kube-state-metrics" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.513178 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.518521 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.518722 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.554961 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.565130 4815 scope.go:117] "RemoveContainer" containerID="2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472" Dec 07 19:36:53 crc kubenswrapper[4815]: E1207 19:36:53.578071 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472\": container with ID starting with 2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472 not found: ID does not exist" containerID="2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.578115 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472"} err="failed to get container status 
\"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472\": rpc error: code = NotFound desc = could not find container \"2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472\": container with ID starting with 2e00117efd8b9f2d1f8e4f741b77a80615ae76d7a0a18cc88633da08214fe472 not found: ID does not exist" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.606451 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs\") pod \"a19e8619-f5e9-449d-87c5-ce78d91af395\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.606649 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data\") pod \"a19e8619-f5e9-449d-87c5-ce78d91af395\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.606705 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2fv\" (UniqueName: \"kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv\") pod \"a19e8619-f5e9-449d-87c5-ce78d91af395\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.606739 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle\") pod \"a19e8619-f5e9-449d-87c5-ce78d91af395\" (UID: \"a19e8619-f5e9-449d-87c5-ce78d91af395\") " Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607030 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2v2g\" (UniqueName: 
\"kubernetes.io/projected/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-api-access-b2v2g\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607063 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607105 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs" (OuterVolumeSpecName: "logs") pod "a19e8619-f5e9-449d-87c5-ce78d91af395" (UID: "a19e8619-f5e9-449d-87c5-ce78d91af395"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607169 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607193 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.607233 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a19e8619-f5e9-449d-87c5-ce78d91af395-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.623160 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv" (OuterVolumeSpecName: "kube-api-access-mj2fv") pod "a19e8619-f5e9-449d-87c5-ce78d91af395" (UID: "a19e8619-f5e9-449d-87c5-ce78d91af395"). InnerVolumeSpecName "kube-api-access-mj2fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.656173 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data" (OuterVolumeSpecName: "config-data") pod "a19e8619-f5e9-449d-87c5-ce78d91af395" (UID: "a19e8619-f5e9-449d-87c5-ce78d91af395"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.663007 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a19e8619-f5e9-449d-87c5-ce78d91af395" (UID: "a19e8619-f5e9-449d-87c5-ce78d91af395"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.709051 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.709951 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.710331 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.710480 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2v2g\" (UniqueName: \"kubernetes.io/projected/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-api-access-b2v2g\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.710630 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.710722 4815 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mj2fv\" (UniqueName: \"kubernetes.io/projected/a19e8619-f5e9-449d-87c5-ce78d91af395-kube-api-access-mj2fv\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.710783 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19e8619-f5e9-449d-87c5-ce78d91af395-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.714810 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.715242 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.715482 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.728151 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2v2g\" (UniqueName: \"kubernetes.io/projected/2b5ed6cd-cad8-491a-b2a6-71868243996f-kube-api-access-b2v2g\") pod \"kube-state-metrics-0\" (UID: \"2b5ed6cd-cad8-491a-b2a6-71868243996f\") " pod="openstack/kube-state-metrics-0" Dec 07 19:36:53 crc 
kubenswrapper[4815]: I1207 19:36:53.780084 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a52cbc1-9245-48e5-8b22-0cdf96dc671b" path="/var/lib/kubelet/pods/8a52cbc1-9245-48e5-8b22-0cdf96dc671b/volumes" Dec 07 19:36:53 crc kubenswrapper[4815]: I1207 19:36:53.851486 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.244196 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.245001 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-central-agent" containerID="cri-o://e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c" gracePeriod=30 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.245228 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="proxy-httpd" containerID="cri-o://0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b" gracePeriod=30 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.245442 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-notification-agent" containerID="cri-o://b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e" gracePeriod=30 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.245497 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="sg-core" containerID="cri-o://1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559" gracePeriod=30 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 
19:36:54.329087 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.392291 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b5ed6cd-cad8-491a-b2a6-71868243996f","Type":"ContainerStarted","Data":"b0e1cd8407b1acfb71adb362db693bbf6c20610480bea3230d41b949daea6f1d"} Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.401682 4815 generic.go:334] "Generic (PLEG): container finished" podID="b54fd738-3402-4969-a7b5-348bb50802cb" containerID="0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b" exitCode=0 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.401716 4815 generic.go:334] "Generic (PLEG): container finished" podID="b54fd738-3402-4969-a7b5-348bb50802cb" containerID="1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559" exitCode=2 Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.401753 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerDied","Data":"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b"} Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.401780 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerDied","Data":"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559"} Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.404602 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.456373 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.465248 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.479268 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.480718 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.488931 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.491043 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.496814 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.634270 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.634336 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.634392 
4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.634425 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.634452 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n7x9\" (UniqueName: \"kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.736446 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.736496 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.736535 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.736559 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.736580 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n7x9\" (UniqueName: \"kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.737279 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.744145 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.741910 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc 
kubenswrapper[4815]: I1207 19:36:54.754987 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.755714 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n7x9\" (UniqueName: \"kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9\") pod \"nova-metadata-0\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " pod="openstack/nova-metadata-0" Dec 07 19:36:54 crc kubenswrapper[4815]: I1207 19:36:54.804445 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.285172 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.414195 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerStarted","Data":"4b9d8d54c72a90904fc87d5f1cf05d005c4434cb259ad7dcf907958e5bc5e858"} Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.415805 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2b5ed6cd-cad8-491a-b2a6-71868243996f","Type":"ContainerStarted","Data":"6f489434dbbfe8e48bab9018be554889d728ae3b21216c96d7efa834fbe9b1eb"} Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.416079 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.418697 4815 generic.go:334] "Generic (PLEG): container finished" podID="b54fd738-3402-4969-a7b5-348bb50802cb" 
containerID="e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c" exitCode=0 Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.418737 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerDied","Data":"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c"} Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.436005 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8570552280000001 podStartE2EDuration="2.435986369s" podCreationTimestamp="2025-12-07 19:36:53 +0000 UTC" firstStartedPulling="2025-12-07 19:36:54.355037247 +0000 UTC m=+1318.934027292" lastFinishedPulling="2025-12-07 19:36:54.933968388 +0000 UTC m=+1319.512958433" observedRunningTime="2025-12-07 19:36:55.431210154 +0000 UTC m=+1320.010200199" watchObservedRunningTime="2025-12-07 19:36:55.435986369 +0000 UTC m=+1320.014976414" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.595472 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.595694 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.797027 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19e8619-f5e9-449d-87c5-ce78d91af395" path="/var/lib/kubelet/pods/a19e8619-f5e9-449d-87c5-ce78d91af395/volumes" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.918207 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.918267 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 07 19:36:55 crc kubenswrapper[4815]: I1207 19:36:55.945486 4815 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.193030 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.239125 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.331631 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.449394 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerStarted","Data":"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f"} Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.449471 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerStarted","Data":"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c"} Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.450899 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="dnsmasq-dns" containerID="cri-o://59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d" gracePeriod=10 Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.502750 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.533705 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.533685373 podStartE2EDuration="2.533685373s" 
podCreationTimestamp="2025-12-07 19:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:36:56.481369856 +0000 UTC m=+1321.060359901" watchObservedRunningTime="2025-12-07 19:36:56.533685373 +0000 UTC m=+1321.112675418" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.678311 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:36:56 crc kubenswrapper[4815]: I1207 19:36:56.678349 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.145031 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.210203 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4x5r\" (UniqueName: \"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r\") pod \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.210345 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb\") pod \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.210669 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc\") pod \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.210832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb\") pod \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.210880 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config\") pod \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\" (UID: \"ca97776a-2f5b-4188-9c96-ff0ee5a94002\") " Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.222689 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r" (OuterVolumeSpecName: "kube-api-access-r4x5r") pod "ca97776a-2f5b-4188-9c96-ff0ee5a94002" (UID: "ca97776a-2f5b-4188-9c96-ff0ee5a94002"). InnerVolumeSpecName "kube-api-access-r4x5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.295636 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ca97776a-2f5b-4188-9c96-ff0ee5a94002" (UID: "ca97776a-2f5b-4188-9c96-ff0ee5a94002"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.302443 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ca97776a-2f5b-4188-9c96-ff0ee5a94002" (UID: "ca97776a-2f5b-4188-9c96-ff0ee5a94002"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.318275 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.318306 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4x5r\" (UniqueName: \"kubernetes.io/projected/ca97776a-2f5b-4188-9c96-ff0ee5a94002-kube-api-access-r4x5r\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.318317 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.330437 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config" (OuterVolumeSpecName: "config") pod "ca97776a-2f5b-4188-9c96-ff0ee5a94002" (UID: "ca97776a-2f5b-4188-9c96-ff0ee5a94002"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.368772 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ca97776a-2f5b-4188-9c96-ff0ee5a94002" (UID: "ca97776a-2f5b-4188-9c96-ff0ee5a94002"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.420532 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.420571 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca97776a-2f5b-4188-9c96-ff0ee5a94002-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.458659 4815 generic.go:334] "Generic (PLEG): container finished" podID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerID="59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d" exitCode=0 Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.458744 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" event={"ID":"ca97776a-2f5b-4188-9c96-ff0ee5a94002","Type":"ContainerDied","Data":"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d"} Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.458762 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.458820 4815 scope.go:117] "RemoveContainer" containerID="59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.458807 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58db5546cc-hwggf" event={"ID":"ca97776a-2f5b-4188-9c96-ff0ee5a94002","Type":"ContainerDied","Data":"be30b35974c34fa57c8cd427b2d64a3031ea003b040eb2bee8ca16fc1c691641"} Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.484640 4815 scope.go:117] "RemoveContainer" containerID="d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.515775 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.551441 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58db5546cc-hwggf"] Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.552083 4815 scope.go:117] "RemoveContainer" containerID="59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d" Dec 07 19:36:57 crc kubenswrapper[4815]: E1207 19:36:57.555950 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d\": container with ID starting with 59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d not found: ID does not exist" containerID="59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.555988 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d"} err="failed to get container status 
\"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d\": rpc error: code = NotFound desc = could not find container \"59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d\": container with ID starting with 59d7e595b532874472069ee447ab845d643b7d72499665443b5a86e02b8b5e9d not found: ID does not exist" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.556013 4815 scope.go:117] "RemoveContainer" containerID="d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f" Dec 07 19:36:57 crc kubenswrapper[4815]: E1207 19:36:57.557012 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f\": container with ID starting with d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f not found: ID does not exist" containerID="d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.557035 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f"} err="failed to get container status \"d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f\": rpc error: code = NotFound desc = could not find container \"d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f\": container with ID starting with d2f3639df657426008057679bc7e36801e0d8ecd62783af87a339716d923963f not found: ID does not exist" Dec 07 19:36:57 crc kubenswrapper[4815]: I1207 19:36:57.802801 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" path="/var/lib/kubelet/pods/ca97776a-2f5b-4188-9c96-ff0ee5a94002/volumes" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.028725 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135077 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135127 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135188 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135259 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135300 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135345 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnbhh\" (UniqueName: 
\"kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135394 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle\") pod \"b54fd738-3402-4969-a7b5-348bb50802cb\" (UID: \"b54fd738-3402-4969-a7b5-348bb50802cb\") " Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.135781 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.136080 4815 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.137695 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.145076 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh" (OuterVolumeSpecName: "kube-api-access-qnbhh") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "kube-api-access-qnbhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.162903 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts" (OuterVolumeSpecName: "scripts") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.189695 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.238556 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.238581 4815 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.238590 4815 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b54fd738-3402-4969-a7b5-348bb50802cb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.238599 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnbhh\" (UniqueName: \"kubernetes.io/projected/b54fd738-3402-4969-a7b5-348bb50802cb-kube-api-access-qnbhh\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.256715 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.277663 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data" (OuterVolumeSpecName: "config-data") pod "b54fd738-3402-4969-a7b5-348bb50802cb" (UID: "b54fd738-3402-4969-a7b5-348bb50802cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.340643 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.340685 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54fd738-3402-4969-a7b5-348bb50802cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.469298 4815 generic.go:334] "Generic (PLEG): container finished" podID="b54fd738-3402-4969-a7b5-348bb50802cb" containerID="b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e" exitCode=0 Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.469384 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerDied","Data":"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e"} Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.469406 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.469430 4815 scope.go:117] "RemoveContainer" containerID="0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.469418 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b54fd738-3402-4969-a7b5-348bb50802cb","Type":"ContainerDied","Data":"2f554b746be7b67ad0df1d46065007514e77a6f700c06730ec854d0baa6eff9c"} Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.479425 4815 generic.go:334] "Generic (PLEG): container finished" podID="2cdbde09-9f1a-4448-8fa3-372d29371084" containerID="6467d92ce190ab795185aadce7f6f83f049a27f8b66597f85497c938c9eacee3" exitCode=0 Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.479492 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tsf6c" event={"ID":"2cdbde09-9f1a-4448-8fa3-372d29371084","Type":"ContainerDied","Data":"6467d92ce190ab795185aadce7f6f83f049a27f8b66597f85497c938c9eacee3"} Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.503660 4815 scope.go:117] "RemoveContainer" containerID="1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.524438 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.542075 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.563071 4815 scope.go:117] "RemoveContainer" containerID="b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.586309 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587086 4815 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="dnsmasq-dns" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587106 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="dnsmasq-dns" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587126 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-notification-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587133 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-notification-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587154 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="sg-core" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587160 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="sg-core" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587171 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="proxy-httpd" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587176 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="proxy-httpd" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587192 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="init" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587197 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="init" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.587218 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-central-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587224 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-central-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587514 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="proxy-httpd" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587534 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-notification-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587548 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca97776a-2f5b-4188-9c96-ff0ee5a94002" containerName="dnsmasq-dns" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587577 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="sg-core" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.587599 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" containerName="ceilometer-central-agent" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.594708 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.625110 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.625414 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.625484 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654532 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654641 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654696 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654783 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654811 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chr2v\" (UniqueName: \"kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654858 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654945 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.654973 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.663831 4815 scope.go:117] "RemoveContainer" containerID="e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.683142 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.706000 4815 scope.go:117] 
"RemoveContainer" containerID="0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.710420 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b\": container with ID starting with 0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b not found: ID does not exist" containerID="0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.710716 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b"} err="failed to get container status \"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b\": rpc error: code = NotFound desc = could not find container \"0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b\": container with ID starting with 0c11b514267e1af828d819e7f52a907b254b7f028df1d55089facafa61e8567b not found: ID does not exist" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.710753 4815 scope.go:117] "RemoveContainer" containerID="1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.711706 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559\": container with ID starting with 1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559 not found: ID does not exist" containerID="1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.711757 4815 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559"} err="failed to get container status \"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559\": rpc error: code = NotFound desc = could not find container \"1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559\": container with ID starting with 1a5542976f7dd3c51a5c257561bd41f5bdae1511b1847939657d90eadb7d4559 not found: ID does not exist" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.711792 4815 scope.go:117] "RemoveContainer" containerID="b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.712273 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e\": container with ID starting with b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e not found: ID does not exist" containerID="b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.712306 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e"} err="failed to get container status \"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e\": rpc error: code = NotFound desc = could not find container \"b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e\": container with ID starting with b38ad080ab273ecc852542a06d26d23620338320c13ef3ceafdf9142996eff0e not found: ID does not exist" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.712325 4815 scope.go:117] "RemoveContainer" containerID="e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c" Dec 07 19:36:58 crc kubenswrapper[4815]: E1207 19:36:58.712927 4815 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c\": container with ID starting with e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c not found: ID does not exist" containerID="e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.712958 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c"} err="failed to get container status \"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c\": rpc error: code = NotFound desc = could not find container \"e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c\": container with ID starting with e84e0d9c057ff0dccd894ce588e5182e8c975caa8163ce63305aab73801a7b3c not found: ID does not exist" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.756872 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.756935 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.756974 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " 
pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.756992 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chr2v\" (UniqueName: \"kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.757016 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.757047 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.757063 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.757124 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.758888 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.759360 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.762033 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.763774 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.767855 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.768078 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.777434 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.778981 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chr2v\" (UniqueName: \"kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v\") pod \"ceilometer-0\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " pod="openstack/ceilometer-0" Dec 07 19:36:58 crc kubenswrapper[4815]: I1207 19:36:58.960531 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.657737 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:36:59 crc kubenswrapper[4815]: W1207 19:36:59.709335 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9587a968_ace0_4eb2_8e73_4230356ac221.slice/crio-a89a9b7487c8ed5c1fb6ecff4b270d6f0460bc7c08f54595afc927391139eacf WatchSource:0}: Error finding container a89a9b7487c8ed5c1fb6ecff4b270d6f0460bc7c08f54595afc927391139eacf: Status 404 returned error can't find the container with id a89a9b7487c8ed5c1fb6ecff4b270d6f0460bc7c08f54595afc927391139eacf Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.806027 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54fd738-3402-4969-a7b5-348bb50802cb" path="/var/lib/kubelet/pods/b54fd738-3402-4969-a7b5-348bb50802cb/volumes" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.807251 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.807286 4815 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.839850 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.980765 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts\") pod \"2cdbde09-9f1a-4448-8fa3-372d29371084\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.981108 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data\") pod \"2cdbde09-9f1a-4448-8fa3-372d29371084\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.981280 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx5vt\" (UniqueName: \"kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt\") pod \"2cdbde09-9f1a-4448-8fa3-372d29371084\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.981435 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle\") pod \"2cdbde09-9f1a-4448-8fa3-372d29371084\" (UID: \"2cdbde09-9f1a-4448-8fa3-372d29371084\") " Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.985666 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts" (OuterVolumeSpecName: "scripts") pod "2cdbde09-9f1a-4448-8fa3-372d29371084" (UID: "2cdbde09-9f1a-4448-8fa3-372d29371084"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:36:59 crc kubenswrapper[4815]: I1207 19:36:59.993691 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt" (OuterVolumeSpecName: "kube-api-access-bx5vt") pod "2cdbde09-9f1a-4448-8fa3-372d29371084" (UID: "2cdbde09-9f1a-4448-8fa3-372d29371084"). InnerVolumeSpecName "kube-api-access-bx5vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.005250 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data" (OuterVolumeSpecName: "config-data") pod "2cdbde09-9f1a-4448-8fa3-372d29371084" (UID: "2cdbde09-9f1a-4448-8fa3-372d29371084"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.028032 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cdbde09-9f1a-4448-8fa3-372d29371084" (UID: "2cdbde09-9f1a-4448-8fa3-372d29371084"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.083974 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.084265 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx5vt\" (UniqueName: \"kubernetes.io/projected/2cdbde09-9f1a-4448-8fa3-372d29371084-kube-api-access-bx5vt\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.084276 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.084284 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdbde09-9f1a-4448-8fa3-372d29371084-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.505074 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tsf6c" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.505078 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tsf6c" event={"ID":"2cdbde09-9f1a-4448-8fa3-372d29371084","Type":"ContainerDied","Data":"1b326df50b080fdd546b07dd85e1733ec781fd093e3afc4a57180cb6c947dcd0"} Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.505126 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b326df50b080fdd546b07dd85e1733ec781fd093e3afc4a57180cb6c947dcd0" Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.516103 4815 generic.go:334] "Generic (PLEG): container finished" podID="e95a85c7-1619-4d79-bc43-3acc00d3ab9a" containerID="1c4ec4243730fce8a5a4eb907ccbef3c5cd68c4dff9ca432250d80287da1aad6" exitCode=0 Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.516197 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" event={"ID":"e95a85c7-1619-4d79-bc43-3acc00d3ab9a","Type":"ContainerDied","Data":"1c4ec4243730fce8a5a4eb907ccbef3c5cd68c4dff9ca432250d80287da1aad6"} Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.520739 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerStarted","Data":"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b"} Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.520848 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerStarted","Data":"a89a9b7487c8ed5c1fb6ecff4b270d6f0460bc7c08f54595afc927391139eacf"} Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.690333 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.690605 4815 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-log" containerID="cri-o://52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b" gracePeriod=30 Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.691101 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-api" containerID="cri-o://4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e" gracePeriod=30 Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.700315 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.700515 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerName="nova-scheduler-scheduler" containerID="cri-o://ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" gracePeriod=30 Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.766710 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.767277 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-log" containerID="cri-o://23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" gracePeriod=30 Dec 07 19:37:00 crc kubenswrapper[4815]: I1207 19:37:00.767530 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-metadata" containerID="cri-o://6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" gracePeriod=30 Dec 07 
19:37:00 crc kubenswrapper[4815]: E1207 19:37:00.930099 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:00 crc kubenswrapper[4815]: E1207 19:37:00.933412 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:00 crc kubenswrapper[4815]: E1207 19:37:00.944055 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:00 crc kubenswrapper[4815]: E1207 19:37:00.944254 4815 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerName="nova-scheduler-scheduler" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.353796 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.407525 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs\") pod \"b6061916-563b-4a80-8f3d-157804a505c3\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.407589 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n7x9\" (UniqueName: \"kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9\") pod \"b6061916-563b-4a80-8f3d-157804a505c3\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.407646 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs\") pod \"b6061916-563b-4a80-8f3d-157804a505c3\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.407728 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data\") pod \"b6061916-563b-4a80-8f3d-157804a505c3\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.407832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle\") pod \"b6061916-563b-4a80-8f3d-157804a505c3\" (UID: \"b6061916-563b-4a80-8f3d-157804a505c3\") " Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.408437 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs" (OuterVolumeSpecName: "logs") pod "b6061916-563b-4a80-8f3d-157804a505c3" (UID: "b6061916-563b-4a80-8f3d-157804a505c3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.415867 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9" (OuterVolumeSpecName: "kube-api-access-4n7x9") pod "b6061916-563b-4a80-8f3d-157804a505c3" (UID: "b6061916-563b-4a80-8f3d-157804a505c3"). InnerVolumeSpecName "kube-api-access-4n7x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.443217 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6061916-563b-4a80-8f3d-157804a505c3" (UID: "b6061916-563b-4a80-8f3d-157804a505c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.456899 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data" (OuterVolumeSpecName: "config-data") pod "b6061916-563b-4a80-8f3d-157804a505c3" (UID: "b6061916-563b-4a80-8f3d-157804a505c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.457688 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b6061916-563b-4a80-8f3d-157804a505c3" (UID: "b6061916-563b-4a80-8f3d-157804a505c3"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.516512 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n7x9\" (UniqueName: \"kubernetes.io/projected/b6061916-563b-4a80-8f3d-157804a505c3-kube-api-access-4n7x9\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.517609 4815 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.517707 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.517813 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6061916-563b-4a80-8f3d-157804a505c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.517892 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6061916-563b-4a80-8f3d-157804a505c3-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.540690 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerStarted","Data":"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6"} Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548333 4815 generic.go:334] "Generic (PLEG): container finished" podID="b6061916-563b-4a80-8f3d-157804a505c3" containerID="6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" exitCode=0 
Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548389 4815 generic.go:334] "Generic (PLEG): container finished" podID="b6061916-563b-4a80-8f3d-157804a505c3" containerID="23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" exitCode=143 Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548439 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerDied","Data":"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f"} Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548472 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerDied","Data":"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c"} Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548485 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6061916-563b-4a80-8f3d-157804a505c3","Type":"ContainerDied","Data":"4b9d8d54c72a90904fc87d5f1cf05d005c4434cb259ad7dcf907958e5bc5e858"} Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548502 4815 scope.go:117] "RemoveContainer" containerID="6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.548600 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.558169 4815 generic.go:334] "Generic (PLEG): container finished" podID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerID="52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b" exitCode=143 Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.558370 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerDied","Data":"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b"} Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.599949 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.608611 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.620071 4815 scope.go:117] "RemoveContainer" containerID="23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.622115 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:01 crc kubenswrapper[4815]: E1207 19:37:01.622700 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdbde09-9f1a-4448-8fa3-372d29371084" containerName="nova-manage" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.622728 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdbde09-9f1a-4448-8fa3-372d29371084" containerName="nova-manage" Dec 07 19:37:01 crc kubenswrapper[4815]: E1207 19:37:01.622756 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-metadata" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.622767 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6061916-563b-4a80-8f3d-157804a505c3" 
containerName="nova-metadata-metadata" Dec 07 19:37:01 crc kubenswrapper[4815]: E1207 19:37:01.622789 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-log" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.622798 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-log" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.623047 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-log" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.623085 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6061916-563b-4a80-8f3d-157804a505c3" containerName="nova-metadata-metadata" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.623155 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdbde09-9f1a-4448-8fa3-372d29371084" containerName="nova-manage" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.635449 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.660372 4815 scope.go:117] "RemoveContainer" containerID="6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" Dec 07 19:37:01 crc kubenswrapper[4815]: E1207 19:37:01.661282 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f\": container with ID starting with 6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f not found: ID does not exist" containerID="6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.661311 4815 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f"} err="failed to get container status \"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f\": rpc error: code = NotFound desc = could not find container \"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f\": container with ID starting with 6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f not found: ID does not exist" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.661345 4815 scope.go:117] "RemoveContainer" containerID="23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" Dec 07 19:37:01 crc kubenswrapper[4815]: E1207 19:37:01.661832 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c\": container with ID starting with 23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c not found: ID does not exist" containerID="23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.661856 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c"} err="failed to get container status \"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c\": rpc error: code = NotFound desc = could not find container \"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c\": container with ID starting with 23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c not found: ID does not exist" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.661868 4815 scope.go:117] "RemoveContainer" containerID="6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.662786 4815 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f"} err="failed to get container status \"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f\": rpc error: code = NotFound desc = could not find container \"6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f\": container with ID starting with 6307d6a0b0bc3be3a5fe9b0ea3faa73d3fe6e3ae31cb442818d1186f88730a2f not found: ID does not exist" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.662802 4815 scope.go:117] "RemoveContainer" containerID="23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.663085 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c"} err="failed to get container status \"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c\": rpc error: code = NotFound desc = could not find container \"23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c\": container with ID starting with 23395269ec647f334fe55381661b1029c90a345465f85c85cd6641d7d94a7a7c not found: ID does not exist" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.680105 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.686951 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.687423 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.725637 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.725741 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.725769 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.725816 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.725928 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.795128 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6061916-563b-4a80-8f3d-157804a505c3" path="/var/lib/kubelet/pods/b6061916-563b-4a80-8f3d-157804a505c3/volumes" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.827161 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.827260 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.827291 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.827326 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " 
pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.827402 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.834666 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.837620 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.839478 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.840076 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.859460 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66xr\" (UniqueName: 
\"kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr\") pod \"nova-metadata-0\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " pod="openstack/nova-metadata-0" Dec 07 19:37:01 crc kubenswrapper[4815]: I1207 19:37:01.942953 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.032282 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data\") pod \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.032365 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle\") pod \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.032423 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts\") pod \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.032508 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpf6k\" (UniqueName: \"kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k\") pod \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\" (UID: \"e95a85c7-1619-4d79-bc43-3acc00d3ab9a\") " Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.038666 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.041413 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts" (OuterVolumeSpecName: "scripts") pod "e95a85c7-1619-4d79-bc43-3acc00d3ab9a" (UID: "e95a85c7-1619-4d79-bc43-3acc00d3ab9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.045100 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k" (OuterVolumeSpecName: "kube-api-access-tpf6k") pod "e95a85c7-1619-4d79-bc43-3acc00d3ab9a" (UID: "e95a85c7-1619-4d79-bc43-3acc00d3ab9a"). InnerVolumeSpecName "kube-api-access-tpf6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.071620 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data" (OuterVolumeSpecName: "config-data") pod "e95a85c7-1619-4d79-bc43-3acc00d3ab9a" (UID: "e95a85c7-1619-4d79-bc43-3acc00d3ab9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.089026 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e95a85c7-1619-4d79-bc43-3acc00d3ab9a" (UID: "e95a85c7-1619-4d79-bc43-3acc00d3ab9a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.140896 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.140968 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.140979 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpf6k\" (UniqueName: \"kubernetes.io/projected/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-kube-api-access-tpf6k\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.140991 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e95a85c7-1619-4d79-bc43-3acc00d3ab9a-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.531730 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.572478 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerStarted","Data":"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33"} Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.573725 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerStarted","Data":"3e4c7ddd5f230b428b727311bd44fe62b2de018f4e3af1ebad411eff90270cd6"} Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.577104 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-nrp9x" event={"ID":"e95a85c7-1619-4d79-bc43-3acc00d3ab9a","Type":"ContainerDied","Data":"a999466062f509667c45e105e9a97387211230533c4ceadb8cd4290c9acdccf5"} Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.577132 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a999466062f509667c45e105e9a97387211230533c4ceadb8cd4290c9acdccf5" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.577182 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nrp9x" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.635842 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 07 19:37:02 crc kubenswrapper[4815]: E1207 19:37:02.636329 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e95a85c7-1619-4d79-bc43-3acc00d3ab9a" containerName="nova-cell1-conductor-db-sync" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.636350 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95a85c7-1619-4d79-bc43-3acc00d3ab9a" containerName="nova-cell1-conductor-db-sync" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.636559 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e95a85c7-1619-4d79-bc43-3acc00d3ab9a" containerName="nova-cell1-conductor-db-sync" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.637226 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.639474 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.656962 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.756564 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.756942 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpshv\" (UniqueName: \"kubernetes.io/projected/c9b7680d-2790-4020-8305-fb018ebbee97-kube-api-access-vpshv\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.756966 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.859095 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc 
kubenswrapper[4815]: I1207 19:37:02.859243 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpshv\" (UniqueName: \"kubernetes.io/projected/c9b7680d-2790-4020-8305-fb018ebbee97-kube-api-access-vpshv\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.859269 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.868435 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.879010 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b7680d-2790-4020-8305-fb018ebbee97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:02 crc kubenswrapper[4815]: I1207 19:37:02.883174 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpshv\" (UniqueName: \"kubernetes.io/projected/c9b7680d-2790-4020-8305-fb018ebbee97-kube-api-access-vpshv\") pod \"nova-cell1-conductor-0\" (UID: \"c9b7680d-2790-4020-8305-fb018ebbee97\") " pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.001101 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.511601 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.586581 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b7680d-2790-4020-8305-fb018ebbee97","Type":"ContainerStarted","Data":"e6fe314b52bee7cda18cd34fa8f371f2b339cecaf3bab0c0374f26e087e2fd6f"} Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.589739 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerStarted","Data":"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca"} Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.591494 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.600413 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerStarted","Data":"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617"} Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.600705 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerStarted","Data":"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89"} Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.634825 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4862663400000002 podStartE2EDuration="5.63480347s" podCreationTimestamp="2025-12-07 19:36:58 +0000 UTC" firstStartedPulling="2025-12-07 19:36:59.720539736 +0000 UTC m=+1324.299529781" lastFinishedPulling="2025-12-07 
19:37:02.869076856 +0000 UTC m=+1327.448066911" observedRunningTime="2025-12-07 19:37:03.624191901 +0000 UTC m=+1328.203182056" watchObservedRunningTime="2025-12-07 19:37:03.63480347 +0000 UTC m=+1328.213793515" Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.647067 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.647044696 podStartE2EDuration="2.647044696s" podCreationTimestamp="2025-12-07 19:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:03.645981866 +0000 UTC m=+1328.224971911" watchObservedRunningTime="2025-12-07 19:37:03.647044696 +0000 UTC m=+1328.226034741" Dec 07 19:37:03 crc kubenswrapper[4815]: I1207 19:37:03.865511 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.216370 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.285628 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle\") pod \"8ade483a-7452-4233-be4b-d2de4c5c6549\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.285680 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs\") pod \"8ade483a-7452-4233-be4b-d2de4c5c6549\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.285775 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjxvb\" (UniqueName: \"kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb\") pod \"8ade483a-7452-4233-be4b-d2de4c5c6549\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.285836 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data\") pod \"8ade483a-7452-4233-be4b-d2de4c5c6549\" (UID: \"8ade483a-7452-4233-be4b-d2de4c5c6549\") " Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.286574 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs" (OuterVolumeSpecName: "logs") pod "8ade483a-7452-4233-be4b-d2de4c5c6549" (UID: "8ade483a-7452-4233-be4b-d2de4c5c6549"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.302682 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb" (OuterVolumeSpecName: "kube-api-access-fjxvb") pod "8ade483a-7452-4233-be4b-d2de4c5c6549" (UID: "8ade483a-7452-4233-be4b-d2de4c5c6549"). InnerVolumeSpecName "kube-api-access-fjxvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.338848 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ade483a-7452-4233-be4b-d2de4c5c6549" (UID: "8ade483a-7452-4233-be4b-d2de4c5c6549"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.339321 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data" (OuterVolumeSpecName: "config-data") pod "8ade483a-7452-4233-be4b-d2de4c5c6549" (UID: "8ade483a-7452-4233-be4b-d2de4c5c6549"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.439181 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.439207 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ade483a-7452-4233-be4b-d2de4c5c6549-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.439219 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjxvb\" (UniqueName: \"kubernetes.io/projected/8ade483a-7452-4233-be4b-d2de4c5c6549-kube-api-access-fjxvb\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.439229 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ade483a-7452-4233-be4b-d2de4c5c6549-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.612784 4815 generic.go:334] "Generic (PLEG): container finished" podID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerID="4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e" exitCode=0 Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.612876 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerDied","Data":"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e"} Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.612908 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8ade483a-7452-4233-be4b-d2de4c5c6549","Type":"ContainerDied","Data":"aa8dd24a810cf358e596ffd9e9cd2e7a5a28ee67aa3a392996092df58ecb1b1a"} Dec 07 19:37:04 crc kubenswrapper[4815]: 
I1207 19:37:04.613096 4815 scope.go:117] "RemoveContainer" containerID="4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.613299 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.616892 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c9b7680d-2790-4020-8305-fb018ebbee97","Type":"ContainerStarted","Data":"57c2e548479c0d795b0c007801aee2b9c692fa7fa731c05a3ccfe084d991e9dd"} Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.654369 4815 scope.go:117] "RemoveContainer" containerID="52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.691307 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.69128956 podStartE2EDuration="2.69128956s" podCreationTimestamp="2025-12-07 19:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:04.67036966 +0000 UTC m=+1329.249359705" watchObservedRunningTime="2025-12-07 19:37:04.69128956 +0000 UTC m=+1329.270279605" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.712809 4815 scope.go:117] "RemoveContainer" containerID="4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.714390 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:04 crc kubenswrapper[4815]: E1207 19:37:04.723274 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e\": container with ID starting with 
4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e not found: ID does not exist" containerID="4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.723316 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e"} err="failed to get container status \"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e\": rpc error: code = NotFound desc = could not find container \"4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e\": container with ID starting with 4b54e609f13e73854e6666a306978e488a02cd34ad291282f8d861f31cf1f41e not found: ID does not exist" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.723341 4815 scope.go:117] "RemoveContainer" containerID="52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.735832 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:04 crc kubenswrapper[4815]: E1207 19:37:04.739078 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b\": container with ID starting with 52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b not found: ID does not exist" containerID="52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.739121 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b"} err="failed to get container status \"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b\": rpc error: code = NotFound desc = could not find container 
\"52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b\": container with ID starting with 52d71aaf46e7310cdbd3dc1313781b8c202d6ba10b188721cda334865571c29b not found: ID does not exist" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.770729 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:04 crc kubenswrapper[4815]: E1207 19:37:04.771658 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-log" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.771681 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-log" Dec 07 19:37:04 crc kubenswrapper[4815]: E1207 19:37:04.771699 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-api" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.771708 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-api" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.771910 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-log" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.771984 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" containerName="nova-api-api" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.773221 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.781340 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.823361 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.865118 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xr8l\" (UniqueName: \"kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.865315 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.865415 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.865447 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.968430 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.968492 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.968590 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xr8l\" (UniqueName: \"kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.968649 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.969139 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.975040 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.976487 
4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:04 crc kubenswrapper[4815]: I1207 19:37:04.990449 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xr8l\" (UniqueName: \"kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l\") pod \"nova-api-0\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " pod="openstack/nova-api-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.092201 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.123872 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.173850 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle\") pod \"46b8c248-9f13-484f-8401-ecb16b5c05fd\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.174150 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44ff4\" (UniqueName: \"kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4\") pod \"46b8c248-9f13-484f-8401-ecb16b5c05fd\" (UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.174218 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data\") pod \"46b8c248-9f13-484f-8401-ecb16b5c05fd\" 
(UID: \"46b8c248-9f13-484f-8401-ecb16b5c05fd\") " Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.186087 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4" (OuterVolumeSpecName: "kube-api-access-44ff4") pod "46b8c248-9f13-484f-8401-ecb16b5c05fd" (UID: "46b8c248-9f13-484f-8401-ecb16b5c05fd"). InnerVolumeSpecName "kube-api-access-44ff4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.209211 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46b8c248-9f13-484f-8401-ecb16b5c05fd" (UID: "46b8c248-9f13-484f-8401-ecb16b5c05fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.241136 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data" (OuterVolumeSpecName: "config-data") pod "46b8c248-9f13-484f-8401-ecb16b5c05fd" (UID: "46b8c248-9f13-484f-8401-ecb16b5c05fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.276837 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.276870 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46b8c248-9f13-484f-8401-ecb16b5c05fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.276882 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44ff4\" (UniqueName: \"kubernetes.io/projected/46b8c248-9f13-484f-8401-ecb16b5c05fd-kube-api-access-44ff4\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633043 4815 generic.go:334] "Generic (PLEG): container finished" podID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" exitCode=0 Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633430 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46b8c248-9f13-484f-8401-ecb16b5c05fd","Type":"ContainerDied","Data":"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1"} Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633464 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46b8c248-9f13-484f-8401-ecb16b5c05fd","Type":"ContainerDied","Data":"0684989e52d790ebd57d6783bd48a00f7560b28d9e3c95ad12e32b19413e8398"} Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633481 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633479 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.633499 4815 scope.go:117] "RemoveContainer" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.636236 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.661408 4815 scope.go:117] "RemoveContainer" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" Dec 07 19:37:05 crc kubenswrapper[4815]: E1207 19:37:05.661824 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1\": container with ID starting with ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1 not found: ID does not exist" containerID="ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.661862 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1"} err="failed to get container status \"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1\": rpc error: code = NotFound desc = could not find container \"ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1\": container with ID starting with ee6e5f2b7f2e5f0ab8ea23735fdba52239a9a6096f7bbfb4e0673cf2751ae3c1 not found: ID does not exist" Dec 07 19:37:05 crc kubenswrapper[4815]: W1207 19:37:05.673436 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7bad849_31ee_4fba_8468_d9b63c4ccfca.slice/crio-ae99a5cb83ce9096aa0d0e7505974da938bdb46658a6ab6b5b1127ba9e7f8dbf WatchSource:0}: Error finding container 
ae99a5cb83ce9096aa0d0e7505974da938bdb46658a6ab6b5b1127ba9e7f8dbf: Status 404 returned error can't find the container with id ae99a5cb83ce9096aa0d0e7505974da938bdb46658a6ab6b5b1127ba9e7f8dbf Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.679879 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.710132 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.728962 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:05 crc kubenswrapper[4815]: E1207 19:37:05.729357 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerName="nova-scheduler-scheduler" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.729373 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerName="nova-scheduler-scheduler" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.729577 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" containerName="nova-scheduler-scheduler" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.730180 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.733227 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.736111 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.784407 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46b8c248-9f13-484f-8401-ecb16b5c05fd" path="/var/lib/kubelet/pods/46b8c248-9f13-484f-8401-ecb16b5c05fd/volumes" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.784947 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ade483a-7452-4233-be4b-d2de4c5c6549" path="/var/lib/kubelet/pods/8ade483a-7452-4233-be4b-d2de4c5c6549/volumes" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.793962 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.794056 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.794172 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v77q4\" (UniqueName: \"kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4\") pod \"nova-scheduler-0\" (UID: 
\"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.895585 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v77q4\" (UniqueName: \"kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.895765 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.895799 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.902325 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.904595 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:05 crc kubenswrapper[4815]: I1207 19:37:05.914988 4815 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v77q4\" (UniqueName: \"kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4\") pod \"nova-scheduler-0\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.105403 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.628753 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.651311 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerStarted","Data":"c963af599373fa32e7231f593104e4e1c347d8ae27fc2993761943d94089a3d4"} Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.651362 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerStarted","Data":"73c3ef270c4ab43c6476e5f134609bb6906f1d3319a3b61f427b6aca790d57cd"} Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.651377 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerStarted","Data":"ae99a5cb83ce9096aa0d0e7505974da938bdb46658a6ab6b5b1127ba9e7f8dbf"} Dec 07 19:37:06 crc kubenswrapper[4815]: I1207 19:37:06.655473 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"249344af-6484-4afc-9771-4d6fe5cee8e1","Type":"ContainerStarted","Data":"70e0cc5bbc95a47a04f90831ef28b1b89f0d7f06f0275da7b7652116390bf354"} Dec 07 19:37:07 crc kubenswrapper[4815]: I1207 19:37:07.039110 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:37:07 crc kubenswrapper[4815]: 
I1207 19:37:07.039457 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:37:07 crc kubenswrapper[4815]: I1207 19:37:07.664415 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"249344af-6484-4afc-9771-4d6fe5cee8e1","Type":"ContainerStarted","Data":"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db"} Dec 07 19:37:07 crc kubenswrapper[4815]: I1207 19:37:07.689977 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.689944601 podStartE2EDuration="3.689944601s" podCreationTimestamp="2025-12-07 19:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:06.679327616 +0000 UTC m=+1331.258317661" watchObservedRunningTime="2025-12-07 19:37:07.689944601 +0000 UTC m=+1332.268934656" Dec 07 19:37:07 crc kubenswrapper[4815]: I1207 19:37:07.694387 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.694373426 podStartE2EDuration="2.694373426s" podCreationTimestamp="2025-12-07 19:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:07.685860985 +0000 UTC m=+1332.264851030" watchObservedRunningTime="2025-12-07 19:37:07.694373426 +0000 UTC m=+1332.273363481" Dec 07 19:37:08 crc kubenswrapper[4815]: I1207 19:37:08.029235 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 07 19:37:11 crc kubenswrapper[4815]: I1207 19:37:11.106251 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 07 19:37:12 crc kubenswrapper[4815]: I1207 19:37:12.080453 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Dec 07 19:37:12 crc kubenswrapper[4815]: I1207 19:37:12.080686 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 07 19:37:13 crc kubenswrapper[4815]: I1207 19:37:13.055229 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:37:13 crc kubenswrapper[4815]: I1207 19:37:13.090135 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:37:15 crc kubenswrapper[4815]: I1207 19:37:15.093249 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:37:15 crc kubenswrapper[4815]: I1207 19:37:15.093825 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:37:16 crc kubenswrapper[4815]: I1207 19:37:16.106041 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 07 19:37:16 crc kubenswrapper[4815]: I1207 19:37:16.144455 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 07 19:37:16 crc kubenswrapper[4815]: I1207 19:37:16.176111 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Dec 07 19:37:16 crc kubenswrapper[4815]: I1207 19:37:16.176111 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.178:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 07 19:37:16 crc kubenswrapper[4815]: I1207 19:37:16.931026 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.056288 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.056902 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.062902 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.063472 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.909333 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.952948 4815 generic.go:334] "Generic (PLEG): container finished" podID="4b346b54-d142-4628-af7d-0479df88e469" containerID="5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41" exitCode=137 Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.953091 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b346b54-d142-4628-af7d-0479df88e469","Type":"ContainerDied","Data":"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41"} Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.953169 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4b346b54-d142-4628-af7d-0479df88e469","Type":"ContainerDied","Data":"613bacf9e69cbd4ad629e87824664890018715e3d183b14aa9489fa4c52dce67"} Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.953245 4815 scope.go:117] "RemoveContainer" containerID="5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.953340 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.974178 4815 scope.go:117] "RemoveContainer" containerID="5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41" Dec 07 19:37:22 crc kubenswrapper[4815]: E1207 19:37:22.974758 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41\": container with ID starting with 5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41 not found: ID does not exist" containerID="5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41" Dec 07 19:37:22 crc kubenswrapper[4815]: I1207 19:37:22.974857 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41"} err="failed to get container status \"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41\": rpc error: code = NotFound desc = could not find container \"5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41\": container with ID starting with 5f54ddcfc950e92bc25aa657f4217c1eefdbd14f27202b47ba786b4e3c162e41 not found: ID does not exist" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.015693 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data\") pod \"4b346b54-d142-4628-af7d-0479df88e469\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.015762 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle\") pod \"4b346b54-d142-4628-af7d-0479df88e469\" (UID: 
\"4b346b54-d142-4628-af7d-0479df88e469\") " Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.015843 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8zbh\" (UniqueName: \"kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh\") pod \"4b346b54-d142-4628-af7d-0479df88e469\" (UID: \"4b346b54-d142-4628-af7d-0479df88e469\") " Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.028268 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh" (OuterVolumeSpecName: "kube-api-access-x8zbh") pod "4b346b54-d142-4628-af7d-0479df88e469" (UID: "4b346b54-d142-4628-af7d-0479df88e469"). InnerVolumeSpecName "kube-api-access-x8zbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.045041 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data" (OuterVolumeSpecName: "config-data") pod "4b346b54-d142-4628-af7d-0479df88e469" (UID: "4b346b54-d142-4628-af7d-0479df88e469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.051290 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b346b54-d142-4628-af7d-0479df88e469" (UID: "4b346b54-d142-4628-af7d-0479df88e469"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.119408 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.119452 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b346b54-d142-4628-af7d-0479df88e469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.119469 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8zbh\" (UniqueName: \"kubernetes.io/projected/4b346b54-d142-4628-af7d-0479df88e469-kube-api-access-x8zbh\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.297988 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.305438 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.333097 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:37:23 crc kubenswrapper[4815]: E1207 19:37:23.333610 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b346b54-d142-4628-af7d-0479df88e469" containerName="nova-cell1-novncproxy-novncproxy" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.333629 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b346b54-d142-4628-af7d-0479df88e469" containerName="nova-cell1-novncproxy-novncproxy" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.333886 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b346b54-d142-4628-af7d-0479df88e469" containerName="nova-cell1-novncproxy-novncproxy" Dec 07 
19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.334794 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.347466 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.350308 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.351158 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.356170 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.424699 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.424749 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.424807 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.424824 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.425008 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4ztc\" (UniqueName: \"kubernetes.io/projected/1986217b-0022-4645-bbd4-a0e3be8a0d03-kube-api-access-n4ztc\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.526733 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.526814 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.526946 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.526989 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.527091 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4ztc\" (UniqueName: \"kubernetes.io/projected/1986217b-0022-4645-bbd4-a0e3be8a0d03-kube-api-access-n4ztc\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.532079 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.532352 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.532353 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc 
kubenswrapper[4815]: I1207 19:37:23.532700 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/1986217b-0022-4645-bbd4-a0e3be8a0d03-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.544583 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4ztc\" (UniqueName: \"kubernetes.io/projected/1986217b-0022-4645-bbd4-a0e3be8a0d03-kube-api-access-n4ztc\") pod \"nova-cell1-novncproxy-0\" (UID: \"1986217b-0022-4645-bbd4-a0e3be8a0d03\") " pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.651008 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:23 crc kubenswrapper[4815]: I1207 19:37:23.823963 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b346b54-d142-4628-af7d-0479df88e469" path="/var/lib/kubelet/pods/4b346b54-d142-4628-af7d-0479df88e469/volumes" Dec 07 19:37:24 crc kubenswrapper[4815]: I1207 19:37:24.165592 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 07 19:37:24 crc kubenswrapper[4815]: I1207 19:37:24.971675 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1986217b-0022-4645-bbd4-a0e3be8a0d03","Type":"ContainerStarted","Data":"f0ae830c54447b0c7c08c682df1d60c6149d480ce861be9347716d410bf0196d"} Dec 07 19:37:24 crc kubenswrapper[4815]: I1207 19:37:24.972251 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1986217b-0022-4645-bbd4-a0e3be8a0d03","Type":"ContainerStarted","Data":"4af3d2b77cf6562b4e723d989d84dc79a5d9acd8f12454bf4a4a1b896b7db9f2"} Dec 07 19:37:24 crc 
kubenswrapper[4815]: I1207 19:37:24.999411 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.999388042 podStartE2EDuration="1.999388042s" podCreationTimestamp="2025-12-07 19:37:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:24.989170943 +0000 UTC m=+1349.568160988" watchObservedRunningTime="2025-12-07 19:37:24.999388042 +0000 UTC m=+1349.578378097" Dec 07 19:37:25 crc kubenswrapper[4815]: I1207 19:37:25.097144 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 07 19:37:25 crc kubenswrapper[4815]: I1207 19:37:25.098571 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 07 19:37:25 crc kubenswrapper[4815]: I1207 19:37:25.100898 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 07 19:37:25 crc kubenswrapper[4815]: I1207 19:37:25.110737 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 07 19:37:25 crc kubenswrapper[4815]: I1207 19:37:25.992401 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.001355 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.229326 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.234837 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.248278 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.359949 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.360011 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.402483 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.402553 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6sjk\" (UniqueName: \"kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.402589 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.402741 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.402772 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.503941 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6sjk\" (UniqueName: \"kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.503987 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.504076 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.504097 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.504888 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.505018 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.505147 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.505187 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb\") pod 
\"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.505590 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.563885 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6sjk\" (UniqueName: \"kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk\") pod \"dnsmasq-dns-68d4b6d797-ts764\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:26 crc kubenswrapper[4815]: I1207 19:37:26.855852 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:27 crc kubenswrapper[4815]: I1207 19:37:27.451810 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:37:28 crc kubenswrapper[4815]: I1207 19:37:28.015422 4815 generic.go:334] "Generic (PLEG): container finished" podID="8a98cb42-d20e-491f-ba28-8202658078bf" containerID="cfdadee2334c72a0d3566bf5a6d4759a9585f63447a15b08434ff5b6bf8d778c" exitCode=0 Dec 07 19:37:28 crc kubenswrapper[4815]: I1207 19:37:28.016034 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" event={"ID":"8a98cb42-d20e-491f-ba28-8202658078bf","Type":"ContainerDied","Data":"cfdadee2334c72a0d3566bf5a6d4759a9585f63447a15b08434ff5b6bf8d778c"} Dec 07 19:37:28 crc kubenswrapper[4815]: I1207 19:37:28.017167 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" event={"ID":"8a98cb42-d20e-491f-ba28-8202658078bf","Type":"ContainerStarted","Data":"d84f7168523e803bf032d86274498bd529c26d406ba525b76b8ca64f8ab90151"} Dec 07 19:37:28 crc kubenswrapper[4815]: I1207 19:37:28.652553 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:28 crc kubenswrapper[4815]: I1207 19:37:28.979547 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.029343 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" event={"ID":"8a98cb42-d20e-491f-ba28-8202658078bf","Type":"ContainerStarted","Data":"c7f8405a81363d17c00f1f7d8b5278c6a1267b3075b9ad696128b7600991c6f4"} Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.030298 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 
19:37:29.092288 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" podStartSLOduration=3.092264307 podStartE2EDuration="3.092264307s" podCreationTimestamp="2025-12-07 19:37:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:29.08033817 +0000 UTC m=+1353.659328215" watchObservedRunningTime="2025-12-07 19:37:29.092264307 +0000 UTC m=+1353.671254352" Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.270116 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.270395 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-central-agent" containerID="cri-o://cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b" gracePeriod=30 Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.270487 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="sg-core" containerID="cri-o://87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33" gracePeriod=30 Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.270534 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-notification-agent" containerID="cri-o://35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6" gracePeriod=30 Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.270640 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="proxy-httpd" 
containerID="cri-o://dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca" gracePeriod=30 Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.567073 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.567418 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-api" containerID="cri-o://c963af599373fa32e7231f593104e4e1c347d8ae27fc2993761943d94089a3d4" gracePeriod=30 Dec 07 19:37:29 crc kubenswrapper[4815]: I1207 19:37:29.567334 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-log" containerID="cri-o://73c3ef270c4ab43c6476e5f134609bb6906f1d3319a3b61f427b6aca790d57cd" gracePeriod=30 Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.040830 4815 generic.go:334] "Generic (PLEG): container finished" podID="9587a968-ace0-4eb2-8e73-4230356ac221" containerID="dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca" exitCode=0 Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.041198 4815 generic.go:334] "Generic (PLEG): container finished" podID="9587a968-ace0-4eb2-8e73-4230356ac221" containerID="87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33" exitCode=2 Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.041216 4815 generic.go:334] "Generic (PLEG): container finished" podID="9587a968-ace0-4eb2-8e73-4230356ac221" containerID="cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b" exitCode=0 Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.041064 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerDied","Data":"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca"} Dec 
07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.041298 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerDied","Data":"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33"} Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.041321 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerDied","Data":"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b"} Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.046351 4815 generic.go:334] "Generic (PLEG): container finished" podID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerID="73c3ef270c4ab43c6476e5f134609bb6906f1d3319a3b61f427b6aca790d57cd" exitCode=143 Dec 07 19:37:30 crc kubenswrapper[4815]: I1207 19:37:30.046413 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerDied","Data":"73c3ef270c4ab43c6476e5f134609bb6906f1d3319a3b61f427b6aca790d57cd"} Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.100058 4815 generic.go:334] "Generic (PLEG): container finished" podID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerID="c963af599373fa32e7231f593104e4e1c347d8ae27fc2993761943d94089a3d4" exitCode=0 Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.100582 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerDied","Data":"c963af599373fa32e7231f593104e4e1c347d8ae27fc2993761943d94089a3d4"} Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.290524 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.429832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xr8l\" (UniqueName: \"kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l\") pod \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.429980 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data\") pod \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.430059 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle\") pod \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.430145 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs\") pod \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\" (UID: \"b7bad849-31ee-4fba-8468-d9b63c4ccfca\") " Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.431510 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs" (OuterVolumeSpecName: "logs") pod "b7bad849-31ee-4fba-8468-d9b63c4ccfca" (UID: "b7bad849-31ee-4fba-8468-d9b63c4ccfca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.435672 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l" (OuterVolumeSpecName: "kube-api-access-8xr8l") pod "b7bad849-31ee-4fba-8468-d9b63c4ccfca" (UID: "b7bad849-31ee-4fba-8468-d9b63c4ccfca"). InnerVolumeSpecName "kube-api-access-8xr8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.458359 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data" (OuterVolumeSpecName: "config-data") pod "b7bad849-31ee-4fba-8468-d9b63c4ccfca" (UID: "b7bad849-31ee-4fba-8468-d9b63c4ccfca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.468287 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7bad849-31ee-4fba-8468-d9b63c4ccfca" (UID: "b7bad849-31ee-4fba-8468-d9b63c4ccfca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.532759 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7bad849-31ee-4fba-8468-d9b63c4ccfca-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.532790 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xr8l\" (UniqueName: \"kubernetes.io/projected/b7bad849-31ee-4fba-8468-d9b63c4ccfca-kube-api-access-8xr8l\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.532803 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.532813 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bad849-31ee-4fba-8468-d9b63c4ccfca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.651784 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:33 crc kubenswrapper[4815]: I1207 19:37:33.675899 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.085085 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.110468 4815 generic.go:334] "Generic (PLEG): container finished" podID="9587a968-ace0-4eb2-8e73-4230356ac221" containerID="35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6" exitCode=0 Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.111168 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerDied","Data":"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6"} Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.111265 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9587a968-ace0-4eb2-8e73-4230356ac221","Type":"ContainerDied","Data":"a89a9b7487c8ed5c1fb6ecff4b270d6f0460bc7c08f54595afc927391139eacf"} Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.111334 4815 scope.go:117] "RemoveContainer" containerID="dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.111519 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.117743 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.119175 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b7bad849-31ee-4fba-8468-d9b63c4ccfca","Type":"ContainerDied","Data":"ae99a5cb83ce9096aa0d0e7505974da938bdb46658a6ab6b5b1127ba9e7f8dbf"} Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.149019 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.162013 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.174612 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.175444 4815 scope.go:117] "RemoveContainer" containerID="87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.221291 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.223514 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="proxy-httpd" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223532 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="proxy-httpd" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.223548 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-notification-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223554 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-notification-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 
19:37:34.223564 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-central-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223572 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-central-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.223586 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="sg-core" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223591 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="sg-core" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.223608 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-api" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223615 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-api" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.223881 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-log" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.223889 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-log" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.224074 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-central-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.224091 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="proxy-httpd" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 
19:37:34.224101 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="ceilometer-notification-agent" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.224114 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" containerName="sg-core" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.224123 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-api" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.224138 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" containerName="nova-api-log" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.226308 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.229808 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.231823 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.231996 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.232098 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250115 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250160 
4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250216 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250244 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chr2v\" (UniqueName: \"kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250274 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250290 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.250323 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: 
\"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.251098 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs\") pod \"9587a968-ace0-4eb2-8e73-4230356ac221\" (UID: \"9587a968-ace0-4eb2-8e73-4230356ac221\") " Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.267879 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.268022 4815 scope.go:117] "RemoveContainer" containerID="35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.268028 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.292672 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v" (OuterVolumeSpecName: "kube-api-access-chr2v") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "kube-api-access-chr2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.297509 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts" (OuterVolumeSpecName: "scripts") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.316951 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.352889 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353096 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353122 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " 
pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353141 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353161 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2dq\" (UniqueName: \"kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353177 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353227 4815 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353238 4815 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353249 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chr2v\" (UniqueName: \"kubernetes.io/projected/9587a968-ace0-4eb2-8e73-4230356ac221-kube-api-access-chr2v\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc 
kubenswrapper[4815]: I1207 19:37:34.353259 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.353266 4815 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9587a968-ace0-4eb2-8e73-4230356ac221-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.388714 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.388805 4815 scope.go:117] "RemoveContainer" containerID="cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.440505 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7kvht"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.442248 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.444530 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.445041 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.453116 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7kvht"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455130 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455174 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455217 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455243 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2dq\" (UniqueName: \"kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 
07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455283 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455317 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455527 4815 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.455562 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.459660 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.469077 4815 scope.go:117] "RemoveContainer" containerID="dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.469370 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.470678 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca\": container with ID starting with dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca not found: ID does not exist" containerID="dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.470707 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca"} err="failed to get container status \"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca\": rpc error: code = NotFound desc = could not find container \"dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca\": container with ID starting with dde72a050ae6583c1f997b1d0632ab6df2fe05042d35bbd8416f744433881fca not found: ID does not exist" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.470729 4815 scope.go:117] "RemoveContainer" containerID="87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.470989 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33\": container with ID starting with 87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33 not found: ID does not exist" containerID="87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471011 4815 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33"} err="failed to get container status \"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33\": rpc error: code = NotFound desc = could not find container \"87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33\": container with ID starting with 87ea4f424ab8a48c8689ba8307522862e75e5c764c210cf7a2f31c773ea54c33 not found: ID does not exist" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471025 4815 scope.go:117] "RemoveContainer" containerID="35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.471219 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6\": container with ID starting with 35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6 not found: ID does not exist" containerID="35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471265 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6"} err="failed to get container status \"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6\": rpc error: code = NotFound desc = could not find container \"35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6\": container with ID starting with 35d5f4c4104bfe95fd8ef2578c85d28f57d32c239db136652c73c0055eed73e6 not found: ID does not exist" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471319 4815 scope.go:117] "RemoveContainer" containerID="cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b" Dec 07 19:37:34 crc kubenswrapper[4815]: E1207 19:37:34.471501 4815 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b\": container with ID starting with cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b not found: ID does not exist" containerID="cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471530 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b"} err="failed to get container status \"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b\": rpc error: code = NotFound desc = could not find container \"cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b\": container with ID starting with cbde8f2beb045c26d969b77b9cbb14e6390930ca52357696b29fe7e8b355641b not found: ID does not exist" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.471544 4815 scope.go:117] "RemoveContainer" containerID="c963af599373fa32e7231f593104e4e1c347d8ae27fc2993761943d94089a3d4" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.483523 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2dq\" (UniqueName: \"kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.486485 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.488764 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data\") pod \"nova-api-0\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.489985 4815 scope.go:117] "RemoveContainer" containerID="73c3ef270c4ab43c6476e5f134609bb6906f1d3319a3b61f427b6aca790d57cd" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.495300 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.497740 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data" (OuterVolumeSpecName: "config-data") pod "9587a968-ace0-4eb2-8e73-4230356ac221" (UID: "9587a968-ace0-4eb2-8e73-4230356ac221"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.556924 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.557050 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.557096 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.557179 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqpjf\" (UniqueName: \"kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.557279 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: 
I1207 19:37:34.557292 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9587a968-ace0-4eb2-8e73-4230356ac221-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.561517 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.658607 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.658679 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.658741 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqpjf\" (UniqueName: \"kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.658804 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 
19:37:34.667958 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.672537 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.676183 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.688456 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqpjf\" (UniqueName: \"kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf\") pod \"nova-cell1-cell-mapping-7kvht\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.763817 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.778704 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.783427 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.805375 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.809529 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.811597 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.812727 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.812860 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.838036 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971589 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-config-data\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971646 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971664 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-scripts\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971702 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971721 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-log-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971747 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/2871977e-b513-4279-b52b-c9612ddc9005-kube-api-access-jndp7\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971770 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:34 crc kubenswrapper[4815]: I1207 19:37:34.971798 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-run-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.066848 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.072855 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.072904 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-log-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.072970 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/2871977e-b513-4279-b52b-c9612ddc9005-kube-api-access-jndp7\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.073005 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.075378 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-run-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.075472 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-config-data\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.075564 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.075588 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-scripts\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.079396 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-log-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.079778 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.080151 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2871977e-b513-4279-b52b-c9612ddc9005-run-httpd\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.080415 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.081004 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-scripts\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.087776 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-config-data\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.090562 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2871977e-b513-4279-b52b-c9612ddc9005-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.098353 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndp7\" (UniqueName: \"kubernetes.io/projected/2871977e-b513-4279-b52b-c9612ddc9005-kube-api-access-jndp7\") pod \"ceilometer-0\" (UID: \"2871977e-b513-4279-b52b-c9612ddc9005\") " 
pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.129217 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerStarted","Data":"8bb3698c5a8bf55e7720221c900066953fdb9c9a28a3e2271e3885534261f285"} Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.269582 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7kvht"] Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.271268 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 07 19:37:35 crc kubenswrapper[4815]: W1207 19:37:35.727357 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2871977e_b513_4279_b52b_c9612ddc9005.slice/crio-23e40f926f52e6bf90e3779e7799bbc4a114e3573aa0018a2ad21c77296c4991 WatchSource:0}: Error finding container 23e40f926f52e6bf90e3779e7799bbc4a114e3573aa0018a2ad21c77296c4991: Status 404 returned error can't find the container with id 23e40f926f52e6bf90e3779e7799bbc4a114e3573aa0018a2ad21c77296c4991 Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.730016 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.731833 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.798386 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9587a968-ace0-4eb2-8e73-4230356ac221" path="/var/lib/kubelet/pods/9587a968-ace0-4eb2-8e73-4230356ac221/volumes" Dec 07 19:37:35 crc kubenswrapper[4815]: I1207 19:37:35.799359 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7bad849-31ee-4fba-8468-d9b63c4ccfca" 
path="/var/lib/kubelet/pods/b7bad849-31ee-4fba-8468-d9b63c4ccfca/volumes" Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.153372 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7kvht" event={"ID":"97bed517-8a3a-42ee-8a74-e9ad9416898a","Type":"ContainerStarted","Data":"484bea2a3a785e8e4efcc939223774e3732c6c7235cd3dd30885896a89b6a7dd"} Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.153425 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7kvht" event={"ID":"97bed517-8a3a-42ee-8a74-e9ad9416898a","Type":"ContainerStarted","Data":"89b4ff301a6bf53b612a2b148af0ec0ba4b2ef1cdc23bf3a1813e84775adc285"} Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.156769 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871977e-b513-4279-b52b-c9612ddc9005","Type":"ContainerStarted","Data":"23e40f926f52e6bf90e3779e7799bbc4a114e3573aa0018a2ad21c77296c4991"} Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.159295 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerStarted","Data":"688b5a1e33aec6fb3f8977ce625ae7b61a265d7bf10a234593dd01692d4aeb78"} Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.159337 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerStarted","Data":"c604803ec0f05e625d3d60a546f978661ca9cba5978a160b9056cf053104af25"} Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.176809 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7kvht" podStartSLOduration=2.176789806 podStartE2EDuration="2.176789806s" podCreationTimestamp="2025-12-07 19:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-07 19:37:36.17196917 +0000 UTC m=+1360.750959215" watchObservedRunningTime="2025-12-07 19:37:36.176789806 +0000 UTC m=+1360.755779841" Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.208519 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.208501781 podStartE2EDuration="2.208501781s" podCreationTimestamp="2025-12-07 19:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:36.200961198 +0000 UTC m=+1360.779951243" watchObservedRunningTime="2025-12-07 19:37:36.208501781 +0000 UTC m=+1360.787491816" Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.858040 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.953399 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:37:36 crc kubenswrapper[4815]: I1207 19:37:36.953661 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="dnsmasq-dns" containerID="cri-o://18ced62d2d9a61bb2e5e92bf9fabb0293ce8263a8c67bd7ac58c5ba5feba5ba2" gracePeriod=10 Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.190763 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871977e-b513-4279-b52b-c9612ddc9005","Type":"ContainerStarted","Data":"c8e5a6e368365f90f54031912d6730d266520508733e3020147a5fa19178711a"} Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.193075 4815 generic.go:334] "Generic (PLEG): container finished" podID="75629150-d865-4487-93f7-5b60e2194ab7" containerID="18ced62d2d9a61bb2e5e92bf9fabb0293ce8263a8c67bd7ac58c5ba5feba5ba2" exitCode=0 Dec 07 19:37:37 crc 
kubenswrapper[4815]: I1207 19:37:37.193783 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" event={"ID":"75629150-d865-4487-93f7-5b60e2194ab7","Type":"ContainerDied","Data":"18ced62d2d9a61bb2e5e92bf9fabb0293ce8263a8c67bd7ac58c5ba5feba5ba2"} Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.458096 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.647404 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4xpn\" (UniqueName: \"kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn\") pod \"75629150-d865-4487-93f7-5b60e2194ab7\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.647832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb\") pod \"75629150-d865-4487-93f7-5b60e2194ab7\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.647978 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb\") pod \"75629150-d865-4487-93f7-5b60e2194ab7\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.648124 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc\") pod \"75629150-d865-4487-93f7-5b60e2194ab7\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.648191 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config\") pod \"75629150-d865-4487-93f7-5b60e2194ab7\" (UID: \"75629150-d865-4487-93f7-5b60e2194ab7\") " Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.660609 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn" (OuterVolumeSpecName: "kube-api-access-n4xpn") pod "75629150-d865-4487-93f7-5b60e2194ab7" (UID: "75629150-d865-4487-93f7-5b60e2194ab7"). InnerVolumeSpecName "kube-api-access-n4xpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.714826 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config" (OuterVolumeSpecName: "config") pod "75629150-d865-4487-93f7-5b60e2194ab7" (UID: "75629150-d865-4487-93f7-5b60e2194ab7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.727296 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75629150-d865-4487-93f7-5b60e2194ab7" (UID: "75629150-d865-4487-93f7-5b60e2194ab7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.735977 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75629150-d865-4487-93f7-5b60e2194ab7" (UID: "75629150-d865-4487-93f7-5b60e2194ab7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.740911 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75629150-d865-4487-93f7-5b60e2194ab7" (UID: "75629150-d865-4487-93f7-5b60e2194ab7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.750547 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4xpn\" (UniqueName: \"kubernetes.io/projected/75629150-d865-4487-93f7-5b60e2194ab7-kube-api-access-n4xpn\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.750581 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.750591 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.750600 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:37 crc kubenswrapper[4815]: I1207 19:37:37.750610 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75629150-d865-4487-93f7-5b60e2194ab7-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.204662 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2871977e-b513-4279-b52b-c9612ddc9005","Type":"ContainerStarted","Data":"101581095a8837da91b48363897d3dcf86bf5d63547602cb7a7809c55992b64a"} Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.204792 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871977e-b513-4279-b52b-c9612ddc9005","Type":"ContainerStarted","Data":"674b9e2d6fc9d96ee88040a7f82e92fdffc52bf3eac97e9bda94154855baf5bd"} Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.207906 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" event={"ID":"75629150-d865-4487-93f7-5b60e2194ab7","Type":"ContainerDied","Data":"1dc4c5df84859027041de704c3221a6714807c1b20166873c96e4f9b88ae88a3"} Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.207997 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8cf6657-nlxw2" Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.208238 4815 scope.go:117] "RemoveContainer" containerID="18ced62d2d9a61bb2e5e92bf9fabb0293ce8263a8c67bd7ac58c5ba5feba5ba2" Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.242957 4815 scope.go:117] "RemoveContainer" containerID="407ecc3e063b112e52b2224b78b55d52b4bba65fc1f7704a4eb0e3ebb97da842" Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.244649 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:37:38 crc kubenswrapper[4815]: I1207 19:37:38.257885 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8cf6657-nlxw2"] Dec 07 19:37:39 crc kubenswrapper[4815]: I1207 19:37:39.779455 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75629150-d865-4487-93f7-5b60e2194ab7" path="/var/lib/kubelet/pods/75629150-d865-4487-93f7-5b60e2194ab7/volumes" Dec 07 19:37:42 crc kubenswrapper[4815]: I1207 19:37:42.244201 4815 generic.go:334] "Generic (PLEG): container 
finished" podID="97bed517-8a3a-42ee-8a74-e9ad9416898a" containerID="484bea2a3a785e8e4efcc939223774e3732c6c7235cd3dd30885896a89b6a7dd" exitCode=0 Dec 07 19:37:42 crc kubenswrapper[4815]: I1207 19:37:42.244263 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7kvht" event={"ID":"97bed517-8a3a-42ee-8a74-e9ad9416898a","Type":"ContainerDied","Data":"484bea2a3a785e8e4efcc939223774e3732c6c7235cd3dd30885896a89b6a7dd"} Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.745682 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.853029 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqpjf\" (UniqueName: \"kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf\") pod \"97bed517-8a3a-42ee-8a74-e9ad9416898a\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.853102 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data\") pod \"97bed517-8a3a-42ee-8a74-e9ad9416898a\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.853230 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle\") pod \"97bed517-8a3a-42ee-8a74-e9ad9416898a\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.853301 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts\") pod 
\"97bed517-8a3a-42ee-8a74-e9ad9416898a\" (UID: \"97bed517-8a3a-42ee-8a74-e9ad9416898a\") " Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.859058 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts" (OuterVolumeSpecName: "scripts") pod "97bed517-8a3a-42ee-8a74-e9ad9416898a" (UID: "97bed517-8a3a-42ee-8a74-e9ad9416898a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.863092 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf" (OuterVolumeSpecName: "kube-api-access-rqpjf") pod "97bed517-8a3a-42ee-8a74-e9ad9416898a" (UID: "97bed517-8a3a-42ee-8a74-e9ad9416898a"). InnerVolumeSpecName "kube-api-access-rqpjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.883116 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data" (OuterVolumeSpecName: "config-data") pod "97bed517-8a3a-42ee-8a74-e9ad9416898a" (UID: "97bed517-8a3a-42ee-8a74-e9ad9416898a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.897704 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97bed517-8a3a-42ee-8a74-e9ad9416898a" (UID: "97bed517-8a3a-42ee-8a74-e9ad9416898a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.954910 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqpjf\" (UniqueName: \"kubernetes.io/projected/97bed517-8a3a-42ee-8a74-e9ad9416898a-kube-api-access-rqpjf\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.954960 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.954970 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:43 crc kubenswrapper[4815]: I1207 19:37:43.954978 4815 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97bed517-8a3a-42ee-8a74-e9ad9416898a-scripts\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.263309 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2871977e-b513-4279-b52b-c9612ddc9005","Type":"ContainerStarted","Data":"143e647a56c411cdd1d735ea323807103f1eb96add0daeb8edee6aeaf5431d9c"} Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.264843 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.268726 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7kvht" event={"ID":"97bed517-8a3a-42ee-8a74-e9ad9416898a","Type":"ContainerDied","Data":"89b4ff301a6bf53b612a2b148af0ec0ba4b2ef1cdc23bf3a1813e84775adc285"} Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.268760 4815 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="89b4ff301a6bf53b612a2b148af0ec0ba4b2ef1cdc23bf3a1813e84775adc285" Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.268776 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7kvht" Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.309606 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.990653684 podStartE2EDuration="10.309571084s" podCreationTimestamp="2025-12-07 19:37:34 +0000 UTC" firstStartedPulling="2025-12-07 19:37:35.729757849 +0000 UTC m=+1360.308747894" lastFinishedPulling="2025-12-07 19:37:44.048675249 +0000 UTC m=+1368.627665294" observedRunningTime="2025-12-07 19:37:44.298769929 +0000 UTC m=+1368.877759984" watchObservedRunningTime="2025-12-07 19:37:44.309571084 +0000 UTC m=+1368.888561129" Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.470838 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.471458 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-log" containerID="cri-o://c604803ec0f05e625d3d60a546f978661ca9cba5978a160b9056cf053104af25" gracePeriod=30 Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.471533 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-api" containerID="cri-o://688b5a1e33aec6fb3f8977ce625ae7b61a265d7bf10a234593dd01692d4aeb78" gracePeriod=30 Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.498289 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.498531 4815 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerName="nova-scheduler-scheduler" containerID="cri-o://d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" gracePeriod=30 Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.540531 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.540762 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" containerID="cri-o://6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89" gracePeriod=30 Dec 07 19:37:44 crc kubenswrapper[4815]: I1207 19:37:44.541043 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-metadata" containerID="cri-o://e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617" gracePeriod=30 Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279237 4815 generic.go:334] "Generic (PLEG): container finished" podID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerID="688b5a1e33aec6fb3f8977ce625ae7b61a265d7bf10a234593dd01692d4aeb78" exitCode=0 Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279517 4815 generic.go:334] "Generic (PLEG): container finished" podID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerID="c604803ec0f05e625d3d60a546f978661ca9cba5978a160b9056cf053104af25" exitCode=143 Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279322 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerDied","Data":"688b5a1e33aec6fb3f8977ce625ae7b61a265d7bf10a234593dd01692d4aeb78"} Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279568 4815 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerDied","Data":"c604803ec0f05e625d3d60a546f978661ca9cba5978a160b9056cf053104af25"} Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279588 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"86e51347-8e5b-4d00-bfed-fd6a64af7e74","Type":"ContainerDied","Data":"8bb3698c5a8bf55e7720221c900066953fdb9c9a28a3e2271e3885534261f285"} Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.279600 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb3698c5a8bf55e7720221c900066953fdb9c9a28a3e2271e3885534261f285" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.282127 4815 generic.go:334] "Generic (PLEG): container finished" podID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerID="6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89" exitCode=143 Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.283239 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerDied","Data":"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89"} Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.331078 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396194 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396269 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396300 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396367 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396428 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2dq\" (UniqueName: \"kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396461 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs\") pod \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\" (UID: \"86e51347-8e5b-4d00-bfed-fd6a64af7e74\") " Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.396969 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs" (OuterVolumeSpecName: "logs") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.421148 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq" (OuterVolumeSpecName: "kube-api-access-pd2dq") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "kube-api-access-pd2dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.436235 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data" (OuterVolumeSpecName: "config-data") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.451576 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.461662 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.464830 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86e51347-8e5b-4d00-bfed-fd6a64af7e74" (UID: "86e51347-8e5b-4d00-bfed-fd6a64af7e74"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497590 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497626 4815 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497636 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86e51347-8e5b-4d00-bfed-fd6a64af7e74-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497645 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2dq\" (UniqueName: \"kubernetes.io/projected/86e51347-8e5b-4d00-bfed-fd6a64af7e74-kube-api-access-pd2dq\") on node \"crc\" DevicePath 
\"\"" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497654 4815 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:45 crc kubenswrapper[4815]: I1207 19:37:45.497661 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86e51347-8e5b-4d00-bfed-fd6a64af7e74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.108309 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.110388 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.113299 4815 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.113360 4815 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-scheduler-0" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerName="nova-scheduler-scheduler" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.289160 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.313730 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.321599 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337140 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.337511 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="dnsmasq-dns" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337527 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="dnsmasq-dns" Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.337542 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="init" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337548 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="init" Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.337564 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-api" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337569 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-api" Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.337591 4815 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-log" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337597 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-log" Dec 07 19:37:46 crc kubenswrapper[4815]: E1207 19:37:46.337609 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97bed517-8a3a-42ee-8a74-e9ad9416898a" containerName="nova-manage" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337614 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="97bed517-8a3a-42ee-8a74-e9ad9416898a" containerName="nova-manage" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337771 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="75629150-d865-4487-93f7-5b60e2194ab7" containerName="dnsmasq-dns" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337782 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="97bed517-8a3a-42ee-8a74-e9ad9416898a" containerName="nova-manage" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337795 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-log" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.337803 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" containerName="nova-api-api" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.339041 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.341716 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.342167 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.343341 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.357108 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.515748 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.515826 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7glw\" (UniqueName: \"kubernetes.io/projected/a8df9c38-f244-46b7-a580-c8882a5a73bf-kube-api-access-s7glw\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.515959 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-config-data\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.515982 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.516097 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.516141 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8df9c38-f244-46b7-a580-c8882a5a73bf-logs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618185 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618280 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7glw\" (UniqueName: \"kubernetes.io/projected/a8df9c38-f244-46b7-a580-c8882a5a73bf-kube-api-access-s7glw\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618338 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-config-data\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " 
pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618357 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618397 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618413 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8df9c38-f244-46b7-a580-c8882a5a73bf-logs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.618993 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8df9c38-f244-46b7-a580-c8882a5a73bf-logs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.624856 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-public-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.625484 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-config-data\") pod 
\"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.628482 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.629879 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8df9c38-f244-46b7-a580-c8882a5a73bf-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.656004 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7glw\" (UniqueName: \"kubernetes.io/projected/a8df9c38-f244-46b7-a580-c8882a5a73bf-kube-api-access-s7glw\") pod \"nova-api-0\" (UID: \"a8df9c38-f244-46b7-a580-c8882a5a73bf\") " pod="openstack/nova-api-0" Dec 07 19:37:46 crc kubenswrapper[4815]: I1207 19:37:46.954480 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 07 19:37:47 crc kubenswrapper[4815]: I1207 19:37:47.499119 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 07 19:37:47 crc kubenswrapper[4815]: I1207 19:37:47.674474 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": read tcp 10.217.0.2:50766->10.217.0.176:8775: read: connection reset by peer" Dec 07 19:37:47 crc kubenswrapper[4815]: I1207 19:37:47.674503 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": read tcp 10.217.0.2:50756->10.217.0.176:8775: read: connection reset by peer" Dec 07 19:37:47 crc kubenswrapper[4815]: I1207 19:37:47.794182 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e51347-8e5b-4d00-bfed-fd6a64af7e74" path="/var/lib/kubelet/pods/86e51347-8e5b-4d00-bfed-fd6a64af7e74/volumes" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.220043 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.259399 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data\") pod \"e4fe96a6-892b-4178-ad60-b1b256140e05\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.259461 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs\") pod \"e4fe96a6-892b-4178-ad60-b1b256140e05\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.259525 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle\") pod \"e4fe96a6-892b-4178-ad60-b1b256140e05\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.259569 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr\") pod \"e4fe96a6-892b-4178-ad60-b1b256140e05\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.259605 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs\") pod \"e4fe96a6-892b-4178-ad60-b1b256140e05\" (UID: \"e4fe96a6-892b-4178-ad60-b1b256140e05\") " Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.262301 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs" (OuterVolumeSpecName: "logs") pod "e4fe96a6-892b-4178-ad60-b1b256140e05" (UID: "e4fe96a6-892b-4178-ad60-b1b256140e05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.276206 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr" (OuterVolumeSpecName: "kube-api-access-t66xr") pod "e4fe96a6-892b-4178-ad60-b1b256140e05" (UID: "e4fe96a6-892b-4178-ad60-b1b256140e05"). InnerVolumeSpecName "kube-api-access-t66xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.307293 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data" (OuterVolumeSpecName: "config-data") pod "e4fe96a6-892b-4178-ad60-b1b256140e05" (UID: "e4fe96a6-892b-4178-ad60-b1b256140e05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.312325 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4fe96a6-892b-4178-ad60-b1b256140e05" (UID: "e4fe96a6-892b-4178-ad60-b1b256140e05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.323278 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8df9c38-f244-46b7-a580-c8882a5a73bf","Type":"ContainerStarted","Data":"6b4e173143ca710765da9308166683dda00d9d242adfc55a0997189bcdf23fe8"} Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.323311 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8df9c38-f244-46b7-a580-c8882a5a73bf","Type":"ContainerStarted","Data":"729ca16cc1e57d61a95b416bd19e351a51242bf6320460983b2ac4c8ecdbaea4"} Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.323320 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a8df9c38-f244-46b7-a580-c8882a5a73bf","Type":"ContainerStarted","Data":"2b3d348014b8924500e8a327bc5d373b6d1ce428a7e1507989f0289b3ebf4183"} Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.340653 4815 generic.go:334] "Generic (PLEG): container finished" podID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerID="e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617" exitCode=0 Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.340717 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerDied","Data":"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617"} Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.340758 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4fe96a6-892b-4178-ad60-b1b256140e05","Type":"ContainerDied","Data":"3e4c7ddd5f230b428b727311bd44fe62b2de018f4e3af1ebad411eff90270cd6"} Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.340777 4815 scope.go:117] "RemoveContainer" containerID="e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617" Dec 07 19:37:48 crc 
kubenswrapper[4815]: I1207 19:37:48.340948 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.369490 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.369526 4815 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fe96a6-892b-4178-ad60-b1b256140e05-logs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.369540 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.369554 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t66xr\" (UniqueName: \"kubernetes.io/projected/e4fe96a6-892b-4178-ad60-b1b256140e05-kube-api-access-t66xr\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.376247 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.37622564 podStartE2EDuration="2.37622564s" podCreationTimestamp="2025-12-07 19:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:48.359254671 +0000 UTC m=+1372.938244716" watchObservedRunningTime="2025-12-07 19:37:48.37622564 +0000 UTC m=+1372.955215675" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.385542 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs" 
(OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e4fe96a6-892b-4178-ad60-b1b256140e05" (UID: "e4fe96a6-892b-4178-ad60-b1b256140e05"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.394397 4815 scope.go:117] "RemoveContainer" containerID="6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.409463 4815 scope.go:117] "RemoveContainer" containerID="e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617" Dec 07 19:37:48 crc kubenswrapper[4815]: E1207 19:37:48.409906 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617\": container with ID starting with e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617 not found: ID does not exist" containerID="e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.409959 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617"} err="failed to get container status \"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617\": rpc error: code = NotFound desc = could not find container \"e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617\": container with ID starting with e264c9708358ce7f36356e8c85bacf46a94553e84b94adfae5b9b6b9d9e47617 not found: ID does not exist" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.409988 4815 scope.go:117] "RemoveContainer" containerID="6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89" Dec 07 19:37:48 crc kubenswrapper[4815]: E1207 19:37:48.410209 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89\": container with ID starting with 6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89 not found: ID does not exist" containerID="6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.410228 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89"} err="failed to get container status \"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89\": rpc error: code = NotFound desc = could not find container \"6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89\": container with ID starting with 6d7131c14f3d1ff802c3e9200b3b8ce02374bc6eb4749ad06558ac19f7787e89 not found: ID does not exist" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.470490 4815 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fe96a6-892b-4178-ad60-b1b256140e05-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.683159 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.694247 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.722190 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:48 crc kubenswrapper[4815]: E1207 19:37:48.722876 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-metadata" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.723005 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" 
containerName="nova-metadata-metadata" Dec 07 19:37:48 crc kubenswrapper[4815]: E1207 19:37:48.723084 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.723098 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.723398 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-metadata" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.723457 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" containerName="nova-metadata-log" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.725066 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.727589 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.728982 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.735705 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.776599 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m2s\" (UniqueName: \"kubernetes.io/projected/809d96c5-69a9-4892-b39c-34cf1313aff6-kube-api-access-77m2s\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.776735 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.776766 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-config-data\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.776801 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.776831 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809d96c5-69a9-4892-b39c-34cf1313aff6-logs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.923452 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.923766 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-config-data\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.923844 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.923898 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809d96c5-69a9-4892-b39c-34cf1313aff6-logs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.923990 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m2s\" (UniqueName: \"kubernetes.io/projected/809d96c5-69a9-4892-b39c-34cf1313aff6-kube-api-access-77m2s\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.928750 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/809d96c5-69a9-4892-b39c-34cf1313aff6-logs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.932654 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 
19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.932723 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.958741 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/809d96c5-69a9-4892-b39c-34cf1313aff6-config-data\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:48 crc kubenswrapper[4815]: I1207 19:37:48.959473 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m2s\" (UniqueName: \"kubernetes.io/projected/809d96c5-69a9-4892-b39c-34cf1313aff6-kube-api-access-77m2s\") pod \"nova-metadata-0\" (UID: \"809d96c5-69a9-4892-b39c-34cf1313aff6\") " pod="openstack/nova-metadata-0" Dec 07 19:37:49 crc kubenswrapper[4815]: I1207 19:37:49.042638 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 07 19:37:49 crc kubenswrapper[4815]: I1207 19:37:49.581681 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 07 19:37:49 crc kubenswrapper[4815]: I1207 19:37:49.780091 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4fe96a6-892b-4178-ad60-b1b256140e05" path="/var/lib/kubelet/pods/e4fe96a6-892b-4178-ad60-b1b256140e05/volumes" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.305777 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.373757 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809d96c5-69a9-4892-b39c-34cf1313aff6","Type":"ContainerStarted","Data":"a2587e945c9f5cd1db4dd8bdcfe0022d74f612d2881572bef1980507561ef197"} Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.373806 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809d96c5-69a9-4892-b39c-34cf1313aff6","Type":"ContainerStarted","Data":"9b39a3c96de3c1d80be1c8c24eb3ba3689434568a78a7ddca346da335571aa5c"} Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.373819 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"809d96c5-69a9-4892-b39c-34cf1313aff6","Type":"ContainerStarted","Data":"04f495f244abbb9256ab5a8d6907ab5f05fb5b14b883faae0d5e7a6e9eb090a1"} Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.386370 4815 generic.go:334] "Generic (PLEG): container finished" podID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" exitCode=0 Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.386439 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.386433 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"249344af-6484-4afc-9771-4d6fe5cee8e1","Type":"ContainerDied","Data":"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db"} Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.386810 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"249344af-6484-4afc-9771-4d6fe5cee8e1","Type":"ContainerDied","Data":"70e0cc5bbc95a47a04f90831ef28b1b89f0d7f06f0275da7b7652116390bf354"} Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.386840 4815 scope.go:117] "RemoveContainer" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.395591 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.3955462770000002 podStartE2EDuration="2.395546277s" podCreationTimestamp="2025-12-07 19:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:37:50.388340254 +0000 UTC m=+1374.967330319" watchObservedRunningTime="2025-12-07 19:37:50.395546277 +0000 UTC m=+1374.974536322" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.412278 4815 scope.go:117] "RemoveContainer" containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" Dec 07 19:37:50 crc kubenswrapper[4815]: E1207 19:37:50.412767 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db\": container with ID starting with d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db not found: ID does not exist" 
containerID="d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.412796 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db"} err="failed to get container status \"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db\": rpc error: code = NotFound desc = could not find container \"d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db\": container with ID starting with d133fd3cbc74694ad05f07bece28185845a5cd322d8c5746ff7a2b0e166535db not found: ID does not exist" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.453661 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v77q4\" (UniqueName: \"kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4\") pod \"249344af-6484-4afc-9771-4d6fe5cee8e1\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.453724 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data\") pod \"249344af-6484-4afc-9771-4d6fe5cee8e1\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.453770 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle\") pod \"249344af-6484-4afc-9771-4d6fe5cee8e1\" (UID: \"249344af-6484-4afc-9771-4d6fe5cee8e1\") " Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.458960 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4" (OuterVolumeSpecName: 
"kube-api-access-v77q4") pod "249344af-6484-4afc-9771-4d6fe5cee8e1" (UID: "249344af-6484-4afc-9771-4d6fe5cee8e1"). InnerVolumeSpecName "kube-api-access-v77q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.486376 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data" (OuterVolumeSpecName: "config-data") pod "249344af-6484-4afc-9771-4d6fe5cee8e1" (UID: "249344af-6484-4afc-9771-4d6fe5cee8e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.487059 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249344af-6484-4afc-9771-4d6fe5cee8e1" (UID: "249344af-6484-4afc-9771-4d6fe5cee8e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.559960 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.560000 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249344af-6484-4afc-9771-4d6fe5cee8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.560017 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v77q4\" (UniqueName: \"kubernetes.io/projected/249344af-6484-4afc-9771-4d6fe5cee8e1-kube-api-access-v77q4\") on node \"crc\" DevicePath \"\"" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.733562 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.757675 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.772301 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:50 crc kubenswrapper[4815]: E1207 19:37:50.772674 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerName="nova-scheduler-scheduler" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.772692 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerName="nova-scheduler-scheduler" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.772900 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" containerName="nova-scheduler-scheduler" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 
19:37:50.773564 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.776439 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.781855 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.966474 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-config-data\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.966881 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:50 crc kubenswrapper[4815]: I1207 19:37:50.967087 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-kube-api-access-t9cjt\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.068991 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-config-data\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc 
kubenswrapper[4815]: I1207 19:37:51.069335 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.069462 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-kube-api-access-t9cjt\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.074160 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.080614 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-config-data\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.085113 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa-kube-api-access-t9cjt\") pod \"nova-scheduler-0\" (UID: \"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa\") " pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.089149 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.727937 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 07 19:37:51 crc kubenswrapper[4815]: W1207 19:37:51.734249 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3117427_1c45_4b4b_bf1d_d0ed52ff2bfa.slice/crio-ddde8aa9ed948b3f88a01d23b7d93a2c4aab12d21cf658f4abeda458b21b317b WatchSource:0}: Error finding container ddde8aa9ed948b3f88a01d23b7d93a2c4aab12d21cf658f4abeda458b21b317b: Status 404 returned error can't find the container with id ddde8aa9ed948b3f88a01d23b7d93a2c4aab12d21cf658f4abeda458b21b317b Dec 07 19:37:51 crc kubenswrapper[4815]: I1207 19:37:51.779494 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249344af-6484-4afc-9771-4d6fe5cee8e1" path="/var/lib/kubelet/pods/249344af-6484-4afc-9771-4d6fe5cee8e1/volumes" Dec 07 19:37:52 crc kubenswrapper[4815]: I1207 19:37:52.407263 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa","Type":"ContainerStarted","Data":"ab58e0983476ceff39d2b204086797aedb0be4822b45ef2fd05e3d99e469e631"} Dec 07 19:37:52 crc kubenswrapper[4815]: I1207 19:37:52.407649 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa","Type":"ContainerStarted","Data":"ddde8aa9ed948b3f88a01d23b7d93a2c4aab12d21cf658f4abeda458b21b317b"} Dec 07 19:37:52 crc kubenswrapper[4815]: I1207 19:37:52.428184 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.428163001 podStartE2EDuration="2.428163001s" podCreationTimestamp="2025-12-07 19:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-07 19:37:52.421492292 +0000 UTC m=+1377.000482337" watchObservedRunningTime="2025-12-07 19:37:52.428163001 +0000 UTC m=+1377.007153046" Dec 07 19:37:54 crc kubenswrapper[4815]: I1207 19:37:54.042975 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:37:54 crc kubenswrapper[4815]: I1207 19:37:54.043332 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 07 19:37:56 crc kubenswrapper[4815]: I1207 19:37:56.089656 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 07 19:37:56 crc kubenswrapper[4815]: I1207 19:37:56.360411 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:37:56 crc kubenswrapper[4815]: I1207 19:37:56.360516 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:37:56 crc kubenswrapper[4815]: I1207 19:37:56.955224 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:37:56 crc kubenswrapper[4815]: I1207 19:37:56.955280 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 07 19:37:57 crc kubenswrapper[4815]: I1207 19:37:57.972185 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8df9c38-f244-46b7-a580-c8882a5a73bf" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:37:57 crc kubenswrapper[4815]: I1207 19:37:57.972197 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a8df9c38-f244-46b7-a580-c8882a5a73bf" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.185:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.561797 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.565081 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.580923 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.688571 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.688612 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcp9\" (UniqueName: \"kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.688992 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.791179 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.791224 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcp9\" (UniqueName: \"kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.791296 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.791798 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.791833 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.814955 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcp9\" (UniqueName: \"kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9\") pod \"redhat-operators-fxmg4\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:58 crc kubenswrapper[4815]: I1207 19:37:58.885267 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:37:59 crc kubenswrapper[4815]: I1207 19:37:59.043035 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 07 19:37:59 crc kubenswrapper[4815]: I1207 19:37:59.044516 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 07 19:37:59 crc kubenswrapper[4815]: I1207 19:37:59.226283 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:37:59 crc kubenswrapper[4815]: I1207 19:37:59.477048 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerStarted","Data":"9de55260dc6cbdebc9758ef9da724920a6e9b93bf7b2ef0b9eac96671f3f2a5f"} Dec 07 19:38:00 crc kubenswrapper[4815]: I1207 19:38:00.063160 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="809d96c5-69a9-4892-b39c-34cf1313aff6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Dec 07 19:38:00 crc kubenswrapper[4815]: I1207 19:38:00.063197 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="809d96c5-69a9-4892-b39c-34cf1313aff6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.186:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 07 19:38:00 crc kubenswrapper[4815]: I1207 19:38:00.487550 4815 generic.go:334] "Generic (PLEG): container finished" podID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerID="aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090" exitCode=0 Dec 07 19:38:00 crc kubenswrapper[4815]: I1207 19:38:00.487592 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerDied","Data":"aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090"} Dec 07 19:38:01 crc kubenswrapper[4815]: I1207 19:38:01.090247 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 07 19:38:01 crc kubenswrapper[4815]: I1207 19:38:01.120135 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 07 19:38:01 crc kubenswrapper[4815]: I1207 19:38:01.497230 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerStarted","Data":"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44"} Dec 07 19:38:01 crc kubenswrapper[4815]: I1207 19:38:01.601632 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 07 19:38:05 crc kubenswrapper[4815]: I1207 19:38:05.538733 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 07 19:38:06 crc kubenswrapper[4815]: 
I1207 19:38:06.962501 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 07 19:38:06 crc kubenswrapper[4815]: I1207 19:38:06.963663 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 07 19:38:06 crc kubenswrapper[4815]: I1207 19:38:06.968533 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 07 19:38:06 crc kubenswrapper[4815]: I1207 19:38:06.971600 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 07 19:38:07 crc kubenswrapper[4815]: I1207 19:38:07.567095 4815 generic.go:334] "Generic (PLEG): container finished" podID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerID="1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44" exitCode=0 Dec 07 19:38:07 crc kubenswrapper[4815]: I1207 19:38:07.569081 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerDied","Data":"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44"} Dec 07 19:38:07 crc kubenswrapper[4815]: I1207 19:38:07.569134 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 07 19:38:07 crc kubenswrapper[4815]: I1207 19:38:07.586060 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 07 19:38:08 crc kubenswrapper[4815]: I1207 19:38:08.578793 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerStarted","Data":"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99"} Dec 07 19:38:08 crc kubenswrapper[4815]: I1207 19:38:08.608301 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-fxmg4" podStartSLOduration=3.042908541 podStartE2EDuration="10.608278402s" podCreationTimestamp="2025-12-07 19:37:58 +0000 UTC" firstStartedPulling="2025-12-07 19:38:00.48980156 +0000 UTC m=+1385.068791605" lastFinishedPulling="2025-12-07 19:38:08.055171411 +0000 UTC m=+1392.634161466" observedRunningTime="2025-12-07 19:38:08.60003957 +0000 UTC m=+1393.179029615" watchObservedRunningTime="2025-12-07 19:38:08.608278402 +0000 UTC m=+1393.187268437" Dec 07 19:38:08 crc kubenswrapper[4815]: I1207 19:38:08.886007 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:08 crc kubenswrapper[4815]: I1207 19:38:08.886409 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:09 crc kubenswrapper[4815]: I1207 19:38:09.051237 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 07 19:38:09 crc kubenswrapper[4815]: I1207 19:38:09.051290 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 07 19:38:09 crc kubenswrapper[4815]: I1207 19:38:09.074749 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 07 19:38:09 crc kubenswrapper[4815]: I1207 19:38:09.076003 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 07 19:38:09 crc kubenswrapper[4815]: I1207 19:38:09.946539 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxmg4" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="registry-server" probeResult="failure" output=< Dec 07 19:38:09 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:38:09 crc kubenswrapper[4815]: > Dec 07 19:38:16 crc 
kubenswrapper[4815]: I1207 19:38:16.967803 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:18 crc kubenswrapper[4815]: I1207 19:38:18.492639 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:18 crc kubenswrapper[4815]: I1207 19:38:18.955316 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:19 crc kubenswrapper[4815]: I1207 19:38:19.022821 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:19 crc kubenswrapper[4815]: I1207 19:38:19.203456 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:38:20 crc kubenswrapper[4815]: I1207 19:38:20.674709 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxmg4" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="registry-server" containerID="cri-o://91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99" gracePeriod=2 Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.123547 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.163813 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities\") pod \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.163872 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggcp9\" (UniqueName: \"kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9\") pod \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.163892 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content\") pod \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\" (UID: \"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78\") " Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.165391 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities" (OuterVolumeSpecName: "utilities") pod "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" (UID: "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.183767 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9" (OuterVolumeSpecName: "kube-api-access-ggcp9") pod "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" (UID: "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78"). InnerVolumeSpecName "kube-api-access-ggcp9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.265512 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.265545 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggcp9\" (UniqueName: \"kubernetes.io/projected/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-kube-api-access-ggcp9\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.269594 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" (UID: "af15d24d-15ca-4cdf-a2a9-83bd7dae8e78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.367014 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.388970 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="rabbitmq" containerID="cri-o://c6a0f7926e8736e115fd39c81271cae94bd2d46de082517f365745f1b0ebf535" gracePeriod=604796 Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.785171 4815 generic.go:334] "Generic (PLEG): container finished" podID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerID="91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99" exitCode=0 Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.785294 4815 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxmg4" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.823556 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerDied","Data":"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99"} Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.823602 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxmg4" event={"ID":"af15d24d-15ca-4cdf-a2a9-83bd7dae8e78","Type":"ContainerDied","Data":"9de55260dc6cbdebc9758ef9da724920a6e9b93bf7b2ef0b9eac96671f3f2a5f"} Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.823631 4815 scope.go:117] "RemoveContainer" containerID="91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.849562 4815 scope.go:117] "RemoveContainer" containerID="1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.855817 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.877149 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxmg4"] Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.930062 4815 scope.go:117] "RemoveContainer" containerID="aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993088 4815 scope.go:117] "RemoveContainer" containerID="91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99" Dec 07 19:38:21 crc kubenswrapper[4815]: E1207 19:38:21.993495 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99\": container with ID starting with 91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99 not found: ID does not exist" containerID="91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993524 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99"} err="failed to get container status \"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99\": rpc error: code = NotFound desc = could not find container \"91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99\": container with ID starting with 91886747469c5504616a11c830cfd425d49cfe0c0b435536ab4a95c68cf15c99 not found: ID does not exist" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993545 4815 scope.go:117] "RemoveContainer" containerID="1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44" Dec 07 19:38:21 crc kubenswrapper[4815]: E1207 19:38:21.993707 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44\": container with ID starting with 1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44 not found: ID does not exist" containerID="1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993726 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44"} err="failed to get container status \"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44\": rpc error: code = NotFound desc = could not find container \"1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44\": container with ID 
starting with 1765c04142d3f109b594b9024c60f4d9e97d82d0720d10b0dc429a120fad2e44 not found: ID does not exist" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993739 4815 scope.go:117] "RemoveContainer" containerID="aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090" Dec 07 19:38:21 crc kubenswrapper[4815]: E1207 19:38:21.993892 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090\": container with ID starting with aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090 not found: ID does not exist" containerID="aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090" Dec 07 19:38:21 crc kubenswrapper[4815]: I1207 19:38:21.993908 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090"} err="failed to get container status \"aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090\": rpc error: code = NotFound desc = could not find container \"aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090\": container with ID starting with aa0aeb38119696d988b3616c91665addf10a14a18b6597f860c0e0ccffd1c090 not found: ID does not exist" Dec 07 19:38:23 crc kubenswrapper[4815]: I1207 19:38:23.058740 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="rabbitmq" containerID="cri-o://29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516" gracePeriod=604796 Dec 07 19:38:23 crc kubenswrapper[4815]: I1207 19:38:23.783501 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" path="/var/lib/kubelet/pods/af15d24d-15ca-4cdf-a2a9-83bd7dae8e78/volumes" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 
19:38:26.360180 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.360700 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.360761 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.361593 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.361655 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17" gracePeriod=600 Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.455339 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="rabbitmq" probeResult="failure" output="dial tcp 
10.217.0.99:5671: connect: connection refused" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.652284 4815 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.980121 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17" exitCode=0 Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.980457 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17"} Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.980505 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d"} Dec 07 19:38:26 crc kubenswrapper[4815]: I1207 19:38:26.980530 4815 scope.go:117] "RemoveContainer" containerID="f9fe1f41b8a7f99c3095c8366f5a8da29f411acbdda680568a628c2a0720c31e" Dec 07 19:38:27 crc kubenswrapper[4815]: I1207 19:38:27.994427 4815 generic.go:334] "Generic (PLEG): container finished" podID="c2f348fe-7af1-4260-9946-27b3e711400d" containerID="c6a0f7926e8736e115fd39c81271cae94bd2d46de082517f365745f1b0ebf535" exitCode=0 Dec 07 19:38:27 crc kubenswrapper[4815]: I1207 19:38:27.995079 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerDied","Data":"c6a0f7926e8736e115fd39c81271cae94bd2d46de082517f365745f1b0ebf535"} Dec 07 19:38:27 crc kubenswrapper[4815]: I1207 19:38:27.995109 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2f348fe-7af1-4260-9946-27b3e711400d","Type":"ContainerDied","Data":"d7fe93dc9250c34e0ee1179b01a5884f7819d6ca8d2a2943b4c221bf26f7d727"} Dec 07 19:38:27 crc kubenswrapper[4815]: I1207 19:38:27.995120 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7fe93dc9250c34e0ee1179b01a5884f7819d6ca8d2a2943b4c221bf26f7d727" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.056425 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131544 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131618 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131675 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131706 4815 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-vh75p\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131773 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131833 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131887 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131933 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.131974 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: 
\"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.132006 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.132058 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data\") pod \"c2f348fe-7af1-4260-9946-27b3e711400d\" (UID: \"c2f348fe-7af1-4260-9946-27b3e711400d\") " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.133011 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.133459 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.133824 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.141151 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.142930 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p" (OuterVolumeSpecName: "kube-api-access-vh75p") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "kube-api-access-vh75p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.161149 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.161449 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.164084 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info" (OuterVolumeSpecName: "pod-info") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.197542 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data" (OuterVolumeSpecName: "config-data") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.238887 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh75p\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-kube-api-access-vh75p\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.238968 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.238977 4815 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2f348fe-7af1-4260-9946-27b3e711400d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239045 4815 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239057 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239066 4815 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2f348fe-7af1-4260-9946-27b3e711400d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239093 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239101 
4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.239109 4815 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.255295 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf" (OuterVolumeSpecName: "server-conf") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.274127 4815 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.308119 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c2f348fe-7af1-4260-9946-27b3e711400d" (UID: "c2f348fe-7af1-4260-9946-27b3e711400d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.341292 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2f348fe-7af1-4260-9946-27b3e711400d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.341326 4815 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2f348fe-7af1-4260-9946-27b3e711400d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:28 crc kubenswrapper[4815]: I1207 19:38:28.341340 4815 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.011814 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.054611 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.065286 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087246 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:29 crc kubenswrapper[4815]: E1207 19:38:29.087725 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="rabbitmq" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087750 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="rabbitmq" Dec 07 19:38:29 crc kubenswrapper[4815]: E1207 19:38:29.087783 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="extract-utilities" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087794 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="extract-utilities" Dec 07 19:38:29 crc kubenswrapper[4815]: E1207 19:38:29.087822 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="registry-server" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087833 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="registry-server" Dec 07 19:38:29 crc kubenswrapper[4815]: E1207 19:38:29.087850 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="setup-container" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087858 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="setup-container" Dec 07 19:38:29 crc kubenswrapper[4815]: E1207 19:38:29.087877 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="extract-content" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.087886 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="extract-content" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.088152 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="af15d24d-15ca-4cdf-a2a9-83bd7dae8e78" containerName="registry-server" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.088213 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f348fe-7af1-4260-9946-27b3e711400d" containerName="rabbitmq" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.089397 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.091747 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.092641 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.093106 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.093236 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ccddp" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.093432 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.094741 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.094785 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.121307 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.157893 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.157973 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158015 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95704672-75eb-411c-a866-09ed671263f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158049 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzvcj\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-kube-api-access-kzvcj\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158103 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158137 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158175 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158198 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158243 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158266 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.158306 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95704672-75eb-411c-a866-09ed671263f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.259950 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260019 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260051 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260107 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260133 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260170 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95704672-75eb-411c-a866-09ed671263f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260199 
4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260234 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260266 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95704672-75eb-411c-a866-09ed671263f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260296 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzvcj\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-kube-api-access-kzvcj\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260347 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.260844 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.261878 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.261589 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.261791 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.261860 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/95704672-75eb-411c-a866-09ed671263f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.261257 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " 
pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.267371 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/95704672-75eb-411c-a866-09ed671263f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.303599 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.303992 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/95704672-75eb-411c-a866-09ed671263f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.311953 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.315695 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzvcj\" (UniqueName: \"kubernetes.io/projected/95704672-75eb-411c-a866-09ed671263f7-kube-api-access-kzvcj\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.321521 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"95704672-75eb-411c-a866-09ed671263f7\") " pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.410242 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.648280 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.769900 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.769954 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.769971 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.769989 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75xhf\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc 
kubenswrapper[4815]: I1207 19:38:29.770018 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770054 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770215 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770255 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770281 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770301 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls\") pod 
\"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.770327 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data\") pod \"814a06c9-c432-4a32-835e-59a4831cf335\" (UID: \"814a06c9-c432-4a32-835e-59a4831cf335\") " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.771416 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.773593 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.788950 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.794975 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf" (OuterVolumeSpecName: "kube-api-access-75xhf") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). 
InnerVolumeSpecName "kube-api-access-75xhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.846708 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.893168 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info" (OuterVolumeSpecName: "pod-info") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.893310 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.893343 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75xhf\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-kube-api-access-75xhf\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.893354 4815 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.919065 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c2f348fe-7af1-4260-9946-27b3e711400d" path="/var/lib/kubelet/pods/c2f348fe-7af1-4260-9946-27b3e711400d/volumes" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.922183 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.922280 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.922743 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: W1207 19:38:29.951149 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95704672_75eb_411c_a866_09ed671263f7.slice/crio-51ae2fdd6335d16e1166c5737fe2613b0a8d850ad6a646ffd7fb4b27314dba16 WatchSource:0}: Error finding container 51ae2fdd6335d16e1166c5737fe2613b0a8d850ad6a646ffd7fb4b27314dba16: Status 404 returned error can't find the container with id 51ae2fdd6335d16e1166c5737fe2613b0a8d850ad6a646ffd7fb4b27314dba16 Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.955257 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf" (OuterVolumeSpecName: "server-conf") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.967777 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data" (OuterVolumeSpecName: "config-data") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995277 4815 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995324 4815 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/814a06c9-c432-4a32-835e-59a4831cf335-pod-info\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995335 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995345 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995353 4815 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/814a06c9-c432-4a32-835e-59a4831cf335-server-conf\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:29 crc kubenswrapper[4815]: I1207 19:38:29.995361 4815 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/814a06c9-c432-4a32-835e-59a4831cf335-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.011077 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.018287 4815 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.025531 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"95704672-75eb-411c-a866-09ed671263f7","Type":"ContainerStarted","Data":"51ae2fdd6335d16e1166c5737fe2613b0a8d850ad6a646ffd7fb4b27314dba16"} Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.027848 4815 generic.go:334] "Generic (PLEG): container finished" podID="814a06c9-c432-4a32-835e-59a4831cf335" containerID="29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516" exitCode=0 Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.027876 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerDied","Data":"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516"} Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.027892 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"814a06c9-c432-4a32-835e-59a4831cf335","Type":"ContainerDied","Data":"809efcbb49440b5061e03637b002d559b4f1fef02a6cea690806e9481f79a192"} Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.027909 4815 scope.go:117] "RemoveContainer" containerID="29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.027964 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.030655 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "814a06c9-c432-4a32-835e-59a4831cf335" (UID: "814a06c9-c432-4a32-835e-59a4831cf335"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.050367 4815 scope.go:117] "RemoveContainer" containerID="27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.079485 4815 scope.go:117] "RemoveContainer" containerID="29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516" Dec 07 19:38:30 crc kubenswrapper[4815]: E1207 19:38:30.079808 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516\": container with ID starting with 29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516 not found: ID does not exist" containerID="29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.079837 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516"} err="failed to get container status \"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516\": rpc error: code = NotFound desc = could not find container \"29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516\": container with ID starting with 29919a154299bd8a43d0a6db20c47d16533c78aca191b5f888bd4bdf9b546516 not found: ID does not exist" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.079858 4815 scope.go:117] "RemoveContainer" containerID="27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65" Dec 07 19:38:30 crc kubenswrapper[4815]: E1207 19:38:30.080210 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65\": container with ID starting with 
27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65 not found: ID does not exist" containerID="27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.080231 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65"} err="failed to get container status \"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65\": rpc error: code = NotFound desc = could not find container \"27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65\": container with ID starting with 27aedc7db2bbcd1f9de2ea9311a9535e98297dae22558748a941d3049bf6da65 not found: ID does not exist" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.097814 4815 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/814a06c9-c432-4a32-835e-59a4831cf335-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.098124 4815 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.385697 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.396788 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.424808 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:30 crc kubenswrapper[4815]: E1207 19:38:30.425175 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="setup-container" Dec 07 19:38:30 crc kubenswrapper[4815]: 
I1207 19:38:30.425192 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="setup-container" Dec 07 19:38:30 crc kubenswrapper[4815]: E1207 19:38:30.425206 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="rabbitmq" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.425218 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="rabbitmq" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.425421 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="814a06c9-c432-4a32-835e-59a4831cf335" containerName="rabbitmq" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.428205 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.430643 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.430899 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.431596 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.431800 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.431668 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.432810 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mpdsj" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 
19:38:30.435890 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.460884 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605078 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4zb\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-kube-api-access-gm4zb\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605158 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bd77dc-c382-467f-928a-4be062c951ca-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605187 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605255 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605325 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605362 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605390 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bd77dc-c382-467f-928a-4be062c951ca-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605435 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605450 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605470 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.605586 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.707784 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708063 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bd77dc-c382-467f-928a-4be062c951ca-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708196 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708300 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708390 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708539 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708664 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm4zb\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-kube-api-access-gm4zb\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708822 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bd77dc-c382-467f-928a-4be062c951ca-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.708962 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.709093 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.709268 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.709495 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.709593 4815 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.711542 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc 
kubenswrapper[4815]: I1207 19:38:30.709160 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bd77dc-c382-467f-928a-4be062c951ca-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.711593 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.712856 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.713637 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.714223 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bd77dc-c382-467f-928a-4be062c951ca-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.714860 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bd77dc-c382-467f-928a-4be062c951ca-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.731807 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bd77dc-c382-467f-928a-4be062c951ca-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.732212 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm4zb\" (UniqueName: \"kubernetes.io/projected/22bd77dc-c382-467f-928a-4be062c951ca-kube-api-access-gm4zb\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:30 crc kubenswrapper[4815]: I1207 19:38:30.746064 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"22bd77dc-c382-467f-928a-4be062c951ca\") " pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:31 crc kubenswrapper[4815]: I1207 19:38:31.047459 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:38:31 crc kubenswrapper[4815]: I1207 19:38:31.682560 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 07 19:38:31 crc kubenswrapper[4815]: I1207 19:38:31.783344 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="814a06c9-c432-4a32-835e-59a4831cf335" path="/var/lib/kubelet/pods/814a06c9-c432-4a32-835e-59a4831cf335/volumes" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.094555 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22bd77dc-c382-467f-928a-4be062c951ca","Type":"ContainerStarted","Data":"b41c138347c0d5671bdc19bf521bf9eac95f78f3b810a28b549edbac0bffc0c9"} Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.098097 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"95704672-75eb-411c-a866-09ed671263f7","Type":"ContainerStarted","Data":"0ca442a9c02607374b4ba35d970d6f6cff0c4b5861aef886ae7eb5461a7a3eab"} Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.429503 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.431776 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.438457 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.446548 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587463 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587509 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587531 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587567 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfc9l\" (UniqueName: \"kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " 
pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587625 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.587649 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.688876 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.688963 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.688984 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " 
pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.689034 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfc9l\" (UniqueName: \"kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.689116 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.689150 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.689992 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.690056 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc 
kubenswrapper[4815]: I1207 19:38:32.690098 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.690191 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.690250 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.710005 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfc9l\" (UniqueName: \"kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l\") pod \"dnsmasq-dns-578b8d767c-2g7pv\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:32 crc kubenswrapper[4815]: I1207 19:38:32.749275 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:33 crc kubenswrapper[4815]: I1207 19:38:33.271257 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:34 crc kubenswrapper[4815]: I1207 19:38:34.143841 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22bd77dc-c382-467f-928a-4be062c951ca","Type":"ContainerStarted","Data":"90c8e9e035b5996e7f0803bae2feb01203a53975cda814d9769899e4a01b3847"} Dec 07 19:38:34 crc kubenswrapper[4815]: I1207 19:38:34.147112 4815 generic.go:334] "Generic (PLEG): container finished" podID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerID="f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29" exitCode=0 Dec 07 19:38:34 crc kubenswrapper[4815]: I1207 19:38:34.147167 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" event={"ID":"ba6c8996-b9bd-4d4d-8fbd-04855bff690d","Type":"ContainerDied","Data":"f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29"} Dec 07 19:38:34 crc kubenswrapper[4815]: I1207 19:38:34.147197 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" event={"ID":"ba6c8996-b9bd-4d4d-8fbd-04855bff690d","Type":"ContainerStarted","Data":"e91da692d35c49721faca3c4a4152cc88521aa12876c80867b401127e25e17bb"} Dec 07 19:38:35 crc kubenswrapper[4815]: I1207 19:38:35.165091 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" event={"ID":"ba6c8996-b9bd-4d4d-8fbd-04855bff690d","Type":"ContainerStarted","Data":"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d"} Dec 07 19:38:35 crc kubenswrapper[4815]: I1207 19:38:35.165762 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:35 crc kubenswrapper[4815]: I1207 19:38:35.211168 4815 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" podStartSLOduration=3.21113922 podStartE2EDuration="3.21113922s" podCreationTimestamp="2025-12-07 19:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:38:35.193953085 +0000 UTC m=+1419.772943140" watchObservedRunningTime="2025-12-07 19:38:35.21113922 +0000 UTC m=+1419.790129285" Dec 07 19:38:42 crc kubenswrapper[4815]: I1207 19:38:42.751205 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:42 crc kubenswrapper[4815]: I1207 19:38:42.842865 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:38:42 crc kubenswrapper[4815]: I1207 19:38:42.843312 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="dnsmasq-dns" containerID="cri-o://c7f8405a81363d17c00f1f7d8b5278c6a1267b3075b9ad696128b7600991c6f4" gracePeriod=10 Dec 07 19:38:42 crc kubenswrapper[4815]: I1207 19:38:42.992457 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-wkhmg"] Dec 07 19:38:42 crc kubenswrapper[4815]: I1207 19:38:42.993869 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.032309 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-wkhmg"] Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068161 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068229 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-dns-svc\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068278 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068303 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-config\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068355 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.068375 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv6dn\" (UniqueName: \"kubernetes.io/projected/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-kube-api-access-sv6dn\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170273 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-config\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170462 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170510 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv6dn\" (UniqueName: \"kubernetes.io/projected/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-kube-api-access-sv6dn\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170596 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170712 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-dns-svc\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.170803 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.171303 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-config\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.171533 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-sb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.172107 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-dns-svc\") pod 
\"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.172213 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-ovsdbserver-nb\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.172709 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-openstack-edpm-ipam\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.195146 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv6dn\" (UniqueName: \"kubernetes.io/projected/8f7331f8-b581-46c1-9e19-4c4fc88a2f29-kube-api-access-sv6dn\") pod \"dnsmasq-dns-667ff9c869-wkhmg\" (UID: \"8f7331f8-b581-46c1-9e19-4c4fc88a2f29\") " pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.236496 4815 generic.go:334] "Generic (PLEG): container finished" podID="8a98cb42-d20e-491f-ba28-8202658078bf" containerID="c7f8405a81363d17c00f1f7d8b5278c6a1267b3075b9ad696128b7600991c6f4" exitCode=0 Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.236736 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" event={"ID":"8a98cb42-d20e-491f-ba28-8202658078bf","Type":"ContainerDied","Data":"c7f8405a81363d17c00f1f7d8b5278c6a1267b3075b9ad696128b7600991c6f4"} Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.337809 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.435518 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.584448 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config\") pod \"8a98cb42-d20e-491f-ba28-8202658078bf\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.584976 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb\") pod \"8a98cb42-d20e-491f-ba28-8202658078bf\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.585153 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc\") pod \"8a98cb42-d20e-491f-ba28-8202658078bf\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.585263 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6sjk\" (UniqueName: \"kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk\") pod \"8a98cb42-d20e-491f-ba28-8202658078bf\" (UID: \"8a98cb42-d20e-491f-ba28-8202658078bf\") " Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.585400 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb\") pod \"8a98cb42-d20e-491f-ba28-8202658078bf\" (UID: 
\"8a98cb42-d20e-491f-ba28-8202658078bf\") " Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.614550 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk" (OuterVolumeSpecName: "kube-api-access-b6sjk") pod "8a98cb42-d20e-491f-ba28-8202658078bf" (UID: "8a98cb42-d20e-491f-ba28-8202658078bf"). InnerVolumeSpecName "kube-api-access-b6sjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.656132 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config" (OuterVolumeSpecName: "config") pod "8a98cb42-d20e-491f-ba28-8202658078bf" (UID: "8a98cb42-d20e-491f-ba28-8202658078bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.670174 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a98cb42-d20e-491f-ba28-8202658078bf" (UID: "8a98cb42-d20e-491f-ba28-8202658078bf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.678558 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a98cb42-d20e-491f-ba28-8202658078bf" (UID: "8a98cb42-d20e-491f-ba28-8202658078bf"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.680393 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a98cb42-d20e-491f-ba28-8202658078bf" (UID: "8a98cb42-d20e-491f-ba28-8202658078bf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.688351 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.688378 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.688389 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.688399 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6sjk\" (UniqueName: \"kubernetes.io/projected/8a98cb42-d20e-491f-ba28-8202658078bf-kube-api-access-b6sjk\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.688408 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a98cb42-d20e-491f-ba28-8202658078bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:43 crc kubenswrapper[4815]: I1207 19:38:43.837787 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667ff9c869-wkhmg"] Dec 07 19:38:43 crc 
kubenswrapper[4815]: W1207 19:38:43.842150 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f7331f8_b581_46c1_9e19_4c4fc88a2f29.slice/crio-a435beca8c17b7e969a4bb90cda56dd5e9a669a5ed0b79ad35badc809cff5750 WatchSource:0}: Error finding container a435beca8c17b7e969a4bb90cda56dd5e9a669a5ed0b79ad35badc809cff5750: Status 404 returned error can't find the container with id a435beca8c17b7e969a4bb90cda56dd5e9a669a5ed0b79ad35badc809cff5750 Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.244831 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" event={"ID":"8a98cb42-d20e-491f-ba28-8202658078bf","Type":"ContainerDied","Data":"d84f7168523e803bf032d86274498bd529c26d406ba525b76b8ca64f8ab90151"} Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.245230 4815 scope.go:117] "RemoveContainer" containerID="c7f8405a81363d17c00f1f7d8b5278c6a1267b3075b9ad696128b7600991c6f4" Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.245087 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68d4b6d797-ts764" Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.246531 4815 generic.go:334] "Generic (PLEG): container finished" podID="8f7331f8-b581-46c1-9e19-4c4fc88a2f29" containerID="f0f56e13d446cbb23da80d40cbd2856f2a72a72e0aa1ec6d91819dae4800e4a7" exitCode=0 Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.246558 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" event={"ID":"8f7331f8-b581-46c1-9e19-4c4fc88a2f29","Type":"ContainerDied","Data":"f0f56e13d446cbb23da80d40cbd2856f2a72a72e0aa1ec6d91819dae4800e4a7"} Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.246574 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" event={"ID":"8f7331f8-b581-46c1-9e19-4c4fc88a2f29","Type":"ContainerStarted","Data":"a435beca8c17b7e969a4bb90cda56dd5e9a669a5ed0b79ad35badc809cff5750"} Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.266857 4815 scope.go:117] "RemoveContainer" containerID="cfdadee2334c72a0d3566bf5a6d4759a9585f63447a15b08434ff5b6bf8d778c" Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.310306 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:38:44 crc kubenswrapper[4815]: I1207 19:38:44.320908 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68d4b6d797-ts764"] Dec 07 19:38:45 crc kubenswrapper[4815]: I1207 19:38:45.258794 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" event={"ID":"8f7331f8-b581-46c1-9e19-4c4fc88a2f29","Type":"ContainerStarted","Data":"945f84b9600f38b4143ff79b7574519084687ca44a01694833ff2c1379786103"} Dec 07 19:38:45 crc kubenswrapper[4815]: I1207 19:38:45.260172 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:45 crc kubenswrapper[4815]: I1207 
19:38:45.284912 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" podStartSLOduration=3.284888109 podStartE2EDuration="3.284888109s" podCreationTimestamp="2025-12-07 19:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:38:45.282617105 +0000 UTC m=+1429.861607180" watchObservedRunningTime="2025-12-07 19:38:45.284888109 +0000 UTC m=+1429.863878184" Dec 07 19:38:45 crc kubenswrapper[4815]: I1207 19:38:45.781651 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" path="/var/lib/kubelet/pods/8a98cb42-d20e-491f-ba28-8202658078bf/volumes" Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.339230 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-667ff9c869-wkhmg" Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.418047 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.419736 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="dnsmasq-dns" containerID="cri-o://1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d" gracePeriod=10 Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.891101 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981361 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981430 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981511 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981533 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfc9l\" (UniqueName: \"kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981577 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.981670 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc\") pod \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\" (UID: \"ba6c8996-b9bd-4d4d-8fbd-04855bff690d\") " Dec 07 19:38:53 crc kubenswrapper[4815]: I1207 19:38:53.990235 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l" (OuterVolumeSpecName: "kube-api-access-hfc9l") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "kube-api-access-hfc9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.045065 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.046205 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config" (OuterVolumeSpecName: "config") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.052604 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.054115 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.064300 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba6c8996-b9bd-4d4d-8fbd-04855bff690d" (UID: "ba6c8996-b9bd-4d4d-8fbd-04855bff690d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084503 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084535 4815 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084547 4815 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-config\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084581 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfc9l\" (UniqueName: \"kubernetes.io/projected/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-kube-api-access-hfc9l\") on node \"crc\" DevicePath \"\"" Dec 07 
19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084592 4815 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.084600 4815 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba6c8996-b9bd-4d4d-8fbd-04855bff690d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.358675 4815 generic.go:334] "Generic (PLEG): container finished" podID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerID="1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d" exitCode=0 Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.358767 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.358769 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" event={"ID":"ba6c8996-b9bd-4d4d-8fbd-04855bff690d","Type":"ContainerDied","Data":"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d"} Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.359317 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-578b8d767c-2g7pv" event={"ID":"ba6c8996-b9bd-4d4d-8fbd-04855bff690d","Type":"ContainerDied","Data":"e91da692d35c49721faca3c4a4152cc88521aa12876c80867b401127e25e17bb"} Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.359366 4815 scope.go:117] "RemoveContainer" containerID="1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.407275 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 
19:38:54.417328 4815 scope.go:117] "RemoveContainer" containerID="f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.422732 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-578b8d767c-2g7pv"] Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.446702 4815 scope.go:117] "RemoveContainer" containerID="1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d" Dec 07 19:38:54 crc kubenswrapper[4815]: E1207 19:38:54.447305 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d\": container with ID starting with 1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d not found: ID does not exist" containerID="1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.447343 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d"} err="failed to get container status \"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d\": rpc error: code = NotFound desc = could not find container \"1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d\": container with ID starting with 1c711034b28e7dbe0ad62c04bb1307e0656b19c1fac62613936c7f6b2394ab9d not found: ID does not exist" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.447371 4815 scope.go:117] "RemoveContainer" containerID="f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29" Dec 07 19:38:54 crc kubenswrapper[4815]: E1207 19:38:54.447761 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29\": container with ID starting 
with f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29 not found: ID does not exist" containerID="f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29" Dec 07 19:38:54 crc kubenswrapper[4815]: I1207 19:38:54.447791 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29"} err="failed to get container status \"f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29\": rpc error: code = NotFound desc = could not find container \"f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29\": container with ID starting with f9eb5a5745ed52280a58811b8dd11d3b160101444711baa53fad35937c336f29 not found: ID does not exist" Dec 07 19:38:55 crc kubenswrapper[4815]: I1207 19:38:55.819380 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" path="/var/lib/kubelet/pods/ba6c8996-b9bd-4d4d-8fbd-04855bff690d/volumes" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.467981 4815 generic.go:334] "Generic (PLEG): container finished" podID="95704672-75eb-411c-a866-09ed671263f7" containerID="0ca442a9c02607374b4ba35d970d6f6cff0c4b5861aef886ae7eb5461a7a3eab" exitCode=0 Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.468041 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"95704672-75eb-411c-a866-09ed671263f7","Type":"ContainerDied","Data":"0ca442a9c02607374b4ba35d970d6f6cff0c4b5861aef886ae7eb5461a7a3eab"} Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.567881 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8"] Dec 07 19:39:03 crc kubenswrapper[4815]: E1207 19:39:03.568628 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="init" Dec 07 19:39:03 crc kubenswrapper[4815]: 
I1207 19:39:03.568736 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="init" Dec 07 19:39:03 crc kubenswrapper[4815]: E1207 19:39:03.569549 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.569652 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: E1207 19:39:03.569754 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.569834 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: E1207 19:39:03.569938 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="init" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.570164 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="init" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.570514 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a98cb42-d20e-491f-ba28-8202658078bf" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.570628 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6c8996-b9bd-4d4d-8fbd-04855bff690d" containerName="dnsmasq-dns" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.572156 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.576320 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.576520 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.576785 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.576473 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.593569 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8"] Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.600658 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.601166 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fsk\" (UniqueName: \"kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: 
I1207 19:39:03.601387 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.601523 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.702895 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.702953 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fsk\" (UniqueName: \"kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.702991 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.703007 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.706509 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.709511 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.710014 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.719273 4815 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c6fsk\" (UniqueName: \"kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:03 crc kubenswrapper[4815]: I1207 19:39:03.768386 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:04 crc kubenswrapper[4815]: I1207 19:39:04.168860 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8"] Dec 07 19:39:04 crc kubenswrapper[4815]: W1207 19:39:04.178882 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6db783ed_5795_4761_8176_3d425073a274.slice/crio-0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918 WatchSource:0}: Error finding container 0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918: Status 404 returned error can't find the container with id 0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918 Dec 07 19:39:04 crc kubenswrapper[4815]: I1207 19:39:04.479340 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" event={"ID":"6db783ed-5795-4761-8176-3d425073a274","Type":"ContainerStarted","Data":"0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918"} Dec 07 19:39:04 crc kubenswrapper[4815]: I1207 19:39:04.481809 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"95704672-75eb-411c-a866-09ed671263f7","Type":"ContainerStarted","Data":"7713f5ca3e2711393cb92f6717f63a73c72b4abbffa4db39b078d3fb095896d3"} Dec 07 19:39:04 crc kubenswrapper[4815]: I1207 19:39:04.482210 4815 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 07 19:39:04 crc kubenswrapper[4815]: I1207 19:39:04.522741 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.522706103 podStartE2EDuration="35.522706103s" podCreationTimestamp="2025-12-07 19:38:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:39:04.504531531 +0000 UTC m=+1449.083521596" watchObservedRunningTime="2025-12-07 19:39:04.522706103 +0000 UTC m=+1449.101696148" Dec 07 19:39:07 crc kubenswrapper[4815]: I1207 19:39:07.556692 4815 generic.go:334] "Generic (PLEG): container finished" podID="22bd77dc-c382-467f-928a-4be062c951ca" containerID="90c8e9e035b5996e7f0803bae2feb01203a53975cda814d9769899e4a01b3847" exitCode=0 Dec 07 19:39:07 crc kubenswrapper[4815]: I1207 19:39:07.557013 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22bd77dc-c382-467f-928a-4be062c951ca","Type":"ContainerDied","Data":"90c8e9e035b5996e7f0803bae2feb01203a53975cda814d9769899e4a01b3847"} Dec 07 19:39:08 crc kubenswrapper[4815]: I1207 19:39:08.568482 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"22bd77dc-c382-467f-928a-4be062c951ca","Type":"ContainerStarted","Data":"c6399db99e8110d50627957d20e3f79a9bf5a09d382046b2590efc7736ea61a7"} Dec 07 19:39:08 crc kubenswrapper[4815]: I1207 19:39:08.569014 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:39:08 crc kubenswrapper[4815]: I1207 19:39:08.603076 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.603048296 podStartE2EDuration="38.603048296s" podCreationTimestamp="2025-12-07 19:38:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 19:39:08.599365292 +0000 UTC m=+1453.178355337" watchObservedRunningTime="2025-12-07 19:39:08.603048296 +0000 UTC m=+1453.182038381" Dec 07 19:39:18 crc kubenswrapper[4815]: I1207 19:39:18.670171 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" event={"ID":"6db783ed-5795-4761-8176-3d425073a274","Type":"ContainerStarted","Data":"39e2a6c7b7707aba18f665b328318dcd739226131ea920c1cc7eef1f4742e96b"} Dec 07 19:39:18 crc kubenswrapper[4815]: I1207 19:39:18.698985 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" podStartSLOduration=1.794918808 podStartE2EDuration="15.698967005s" podCreationTimestamp="2025-12-07 19:39:03 +0000 UTC" firstStartedPulling="2025-12-07 19:39:04.180462266 +0000 UTC m=+1448.759452311" lastFinishedPulling="2025-12-07 19:39:18.084510463 +0000 UTC m=+1462.663500508" observedRunningTime="2025-12-07 19:39:18.688600293 +0000 UTC m=+1463.267590338" watchObservedRunningTime="2025-12-07 19:39:18.698967005 +0000 UTC m=+1463.277957050" Dec 07 19:39:19 crc kubenswrapper[4815]: I1207 19:39:19.414361 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 07 19:39:21 crc kubenswrapper[4815]: I1207 19:39:21.053166 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 07 19:39:31 crc kubenswrapper[4815]: I1207 19:39:31.783370 4815 generic.go:334] "Generic (PLEG): container finished" podID="6db783ed-5795-4761-8176-3d425073a274" containerID="39e2a6c7b7707aba18f665b328318dcd739226131ea920c1cc7eef1f4742e96b" exitCode=0 Dec 07 19:39:31 crc kubenswrapper[4815]: I1207 19:39:31.785030 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" event={"ID":"6db783ed-5795-4761-8176-3d425073a274","Type":"ContainerDied","Data":"39e2a6c7b7707aba18f665b328318dcd739226131ea920c1cc7eef1f4742e96b"} Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.244807 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.387334 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory\") pod \"6db783ed-5795-4761-8176-3d425073a274\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.387483 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key\") pod \"6db783ed-5795-4761-8176-3d425073a274\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.387554 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle\") pod \"6db783ed-5795-4761-8176-3d425073a274\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.387623 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6fsk\" (UniqueName: \"kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk\") pod \"6db783ed-5795-4761-8176-3d425073a274\" (UID: \"6db783ed-5795-4761-8176-3d425073a274\") " Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.396307 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6db783ed-5795-4761-8176-3d425073a274" (UID: "6db783ed-5795-4761-8176-3d425073a274"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.396537 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk" (OuterVolumeSpecName: "kube-api-access-c6fsk") pod "6db783ed-5795-4761-8176-3d425073a274" (UID: "6db783ed-5795-4761-8176-3d425073a274"). InnerVolumeSpecName "kube-api-access-c6fsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.417951 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6db783ed-5795-4761-8176-3d425073a274" (UID: "6db783ed-5795-4761-8176-3d425073a274"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.423627 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory" (OuterVolumeSpecName: "inventory") pod "6db783ed-5795-4761-8176-3d425073a274" (UID: "6db783ed-5795-4761-8176-3d425073a274"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.490146 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6fsk\" (UniqueName: \"kubernetes.io/projected/6db783ed-5795-4761-8176-3d425073a274-kube-api-access-c6fsk\") on node \"crc\" DevicePath \"\"" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.490175 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.490183 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.490193 4815 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6db783ed-5795-4761-8176-3d425073a274-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.808199 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" event={"ID":"6db783ed-5795-4761-8176-3d425073a274","Type":"ContainerDied","Data":"0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918"} Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.808490 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dac048ea6f80ef8f4ce7c065b436589aa58fafeb3ec6104f27f4477d3c01918" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.808263 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.994798 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l"] Dec 07 19:39:33 crc kubenswrapper[4815]: E1207 19:39:33.995368 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db783ed-5795-4761-8176-3d425073a274" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.995391 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db783ed-5795-4761-8176-3d425073a274" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.995620 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db783ed-5795-4761-8176-3d425073a274" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.996362 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.999313 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.999420 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.999508 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphs4\" (UniqueName: \"kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.999540 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:33 crc kubenswrapper[4815]: I1207 19:39:33.999807 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: 
\"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.000077 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.001944 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.002120 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.018201 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l"] Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.101504 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.101604 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.101700 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphs4\" (UniqueName: \"kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.101720 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.107604 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.107793 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.114287 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.119545 4815 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qphs4\" (UniqueName: \"kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.361956 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:39:34 crc kubenswrapper[4815]: I1207 19:39:34.892780 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l"] Dec 07 19:39:35 crc kubenswrapper[4815]: I1207 19:39:35.828296 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" event={"ID":"ee1d4597-bd09-4333-b5bd-e50c52c92cd3","Type":"ContainerStarted","Data":"3e43f36a7c67aaa817a84cd9b0c06d463555e74cd0d7dd42f4028fcf42f30af2"} Dec 07 19:39:35 crc kubenswrapper[4815]: I1207 19:39:35.828856 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" event={"ID":"ee1d4597-bd09-4333-b5bd-e50c52c92cd3","Type":"ContainerStarted","Data":"41863916b3c039cd2ab481f643c14a675c55790d660bc478677bd40be953770e"} Dec 07 19:39:35 crc kubenswrapper[4815]: I1207 19:39:35.862567 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" podStartSLOduration=2.387611249 podStartE2EDuration="2.862551063s" podCreationTimestamp="2025-12-07 19:39:33 +0000 UTC" firstStartedPulling="2025-12-07 19:39:34.9170902 +0000 UTC m=+1479.496080245" lastFinishedPulling="2025-12-07 19:39:35.392030014 +0000 UTC m=+1479.971020059" observedRunningTime="2025-12-07 19:39:35.854710052 +0000 UTC m=+1480.433700097" watchObservedRunningTime="2025-12-07 
19:39:35.862551063 +0000 UTC m=+1480.441541108" Dec 07 19:40:21 crc kubenswrapper[4815]: I1207 19:40:21.639564 4815 scope.go:117] "RemoveContainer" containerID="35fe694f36a5b968fea28439d6a0189c15b3d94d39753db9a98857fefc277512" Dec 07 19:40:21 crc kubenswrapper[4815]: I1207 19:40:21.668403 4815 scope.go:117] "RemoveContainer" containerID="37ffa573132baeeedd24de71eaf28cc651f659e5ea501f45e91697b3ce504cab" Dec 07 19:40:21 crc kubenswrapper[4815]: I1207 19:40:21.729732 4815 scope.go:117] "RemoveContainer" containerID="c6a0f7926e8736e115fd39c81271cae94bd2d46de082517f365745f1b0ebf535" Dec 07 19:40:26 crc kubenswrapper[4815]: I1207 19:40:26.360149 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:40:26 crc kubenswrapper[4815]: I1207 19:40:26.361884 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:40:56 crc kubenswrapper[4815]: I1207 19:40:56.360093 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:40:56 crc kubenswrapper[4815]: I1207 19:40:56.360959 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:41:21 crc kubenswrapper[4815]: I1207 19:41:21.858501 4815 scope.go:117] "RemoveContainer" containerID="0ee1085ac12971aaec815c4546ec9f44dc4c3d76c683ba6fb5073a35a0387092" Dec 07 19:41:26 crc kubenswrapper[4815]: I1207 19:41:26.360172 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:41:26 crc kubenswrapper[4815]: I1207 19:41:26.360737 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:41:26 crc kubenswrapper[4815]: I1207 19:41:26.360782 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:41:26 crc kubenswrapper[4815]: I1207 19:41:26.361275 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:41:26 crc kubenswrapper[4815]: I1207 19:41:26.361332 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" 
containerID="cri-o://f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" gracePeriod=600 Dec 07 19:41:26 crc kubenswrapper[4815]: E1207 19:41:26.504977 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:41:27 crc kubenswrapper[4815]: I1207 19:41:27.077881 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" exitCode=0 Dec 07 19:41:27 crc kubenswrapper[4815]: I1207 19:41:27.077982 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d"} Dec 07 19:41:27 crc kubenswrapper[4815]: I1207 19:41:27.078290 4815 scope.go:117] "RemoveContainer" containerID="5abc2074a486ad753dc90f2524ca4ee8cd9d4b0e73ed194398bf623a9d215d17" Dec 07 19:41:27 crc kubenswrapper[4815]: I1207 19:41:27.078997 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:41:27 crc kubenswrapper[4815]: E1207 19:41:27.079275 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" 
podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:41:38 crc kubenswrapper[4815]: I1207 19:41:38.769444 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:41:38 crc kubenswrapper[4815]: E1207 19:41:38.771177 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.613733 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.617986 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.657687 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.723693 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.723777 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities\") pod \"redhat-marketplace-fttjl\" (UID: 
\"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.724137 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-479df\" (UniqueName: \"kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.848582 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.848742 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.848773 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-479df\" (UniqueName: \"kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.849610 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content\") pod \"redhat-marketplace-fttjl\" (UID: 
\"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.849656 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.868482 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-479df\" (UniqueName: \"kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df\") pod \"redhat-marketplace-fttjl\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:42 crc kubenswrapper[4815]: I1207 19:41:42.945427 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:43 crc kubenswrapper[4815]: I1207 19:41:43.449246 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:44 crc kubenswrapper[4815]: I1207 19:41:44.323848 4815 generic.go:334] "Generic (PLEG): container finished" podID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerID="a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74" exitCode=0 Dec 07 19:41:44 crc kubenswrapper[4815]: I1207 19:41:44.323899 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerDied","Data":"a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74"} Dec 07 19:41:44 crc kubenswrapper[4815]: I1207 19:41:44.323993 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" 
event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerStarted","Data":"4f15befd41f3115c057b9cfd80e0d9ec74d7c13ff293ceb6b94e83e56a016990"} Dec 07 19:41:45 crc kubenswrapper[4815]: I1207 19:41:45.333820 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerStarted","Data":"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508"} Dec 07 19:41:46 crc kubenswrapper[4815]: I1207 19:41:46.352218 4815 generic.go:334] "Generic (PLEG): container finished" podID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerID="95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508" exitCode=0 Dec 07 19:41:46 crc kubenswrapper[4815]: I1207 19:41:46.352342 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerDied","Data":"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508"} Dec 07 19:41:47 crc kubenswrapper[4815]: I1207 19:41:47.366305 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerStarted","Data":"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1"} Dec 07 19:41:47 crc kubenswrapper[4815]: I1207 19:41:47.393115 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fttjl" podStartSLOduration=2.838374843 podStartE2EDuration="5.393087773s" podCreationTimestamp="2025-12-07 19:41:42 +0000 UTC" firstStartedPulling="2025-12-07 19:41:44.325658231 +0000 UTC m=+1608.904648286" lastFinishedPulling="2025-12-07 19:41:46.880371171 +0000 UTC m=+1611.459361216" observedRunningTime="2025-12-07 19:41:47.384540003 +0000 UTC m=+1611.963530048" watchObservedRunningTime="2025-12-07 19:41:47.393087773 +0000 UTC 
m=+1611.972077818" Dec 07 19:41:52 crc kubenswrapper[4815]: I1207 19:41:52.946169 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:52 crc kubenswrapper[4815]: I1207 19:41:52.947691 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:53 crc kubenswrapper[4815]: I1207 19:41:53.003706 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:53 crc kubenswrapper[4815]: I1207 19:41:53.478874 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:53 crc kubenswrapper[4815]: I1207 19:41:53.522706 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:53 crc kubenswrapper[4815]: I1207 19:41:53.770308 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:41:53 crc kubenswrapper[4815]: E1207 19:41:53.770562 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.454382 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fttjl" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="registry-server" containerID="cri-o://b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1" gracePeriod=2 Dec 07 
19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.867160 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.918624 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities\") pod \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.918811 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content\") pod \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.918891 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-479df\" (UniqueName: \"kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df\") pod \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\" (UID: \"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8\") " Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.919829 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities" (OuterVolumeSpecName: "utilities") pod "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" (UID: "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.920208 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.935205 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df" (OuterVolumeSpecName: "kube-api-access-479df") pod "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" (UID: "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8"). InnerVolumeSpecName "kube-api-access-479df". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:41:55 crc kubenswrapper[4815]: I1207 19:41:55.948661 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" (UID: "b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.021221 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-479df\" (UniqueName: \"kubernetes.io/projected/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-kube-api-access-479df\") on node \"crc\" DevicePath \"\"" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.021451 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.468204 4815 generic.go:334] "Generic (PLEG): container finished" podID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerID="b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1" exitCode=0 Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.468287 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerDied","Data":"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1"} Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.468331 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fttjl" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.468363 4815 scope.go:117] "RemoveContainer" containerID="b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.468339 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fttjl" event={"ID":"b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8","Type":"ContainerDied","Data":"4f15befd41f3115c057b9cfd80e0d9ec74d7c13ff293ceb6b94e83e56a016990"} Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.493620 4815 scope.go:117] "RemoveContainer" containerID="95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.524556 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.527200 4815 scope.go:117] "RemoveContainer" containerID="a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.535175 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fttjl"] Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.569200 4815 scope.go:117] "RemoveContainer" containerID="b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1" Dec 07 19:41:56 crc kubenswrapper[4815]: E1207 19:41:56.569739 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1\": container with ID starting with b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1 not found: ID does not exist" containerID="b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.569785 4815 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1"} err="failed to get container status \"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1\": rpc error: code = NotFound desc = could not find container \"b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1\": container with ID starting with b09a385406a54e720da0f3343435bf4e08a3c520bdad60c66c03a8792cf9cff1 not found: ID does not exist" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.569812 4815 scope.go:117] "RemoveContainer" containerID="95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508" Dec 07 19:41:56 crc kubenswrapper[4815]: E1207 19:41:56.570270 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508\": container with ID starting with 95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508 not found: ID does not exist" containerID="95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.570290 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508"} err="failed to get container status \"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508\": rpc error: code = NotFound desc = could not find container \"95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508\": container with ID starting with 95f53a2b7af66fcbd263acdbfc3aa22f7c67799263d66bc8cf1d54bc4d439508 not found: ID does not exist" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.570304 4815 scope.go:117] "RemoveContainer" containerID="a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74" Dec 07 19:41:56 crc kubenswrapper[4815]: E1207 
19:41:56.570649 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74\": container with ID starting with a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74 not found: ID does not exist" containerID="a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74" Dec 07 19:41:56 crc kubenswrapper[4815]: I1207 19:41:56.570678 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74"} err="failed to get container status \"a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74\": rpc error: code = NotFound desc = could not find container \"a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74\": container with ID starting with a2e6d7f427bf98cf478f32655a14f22d886268e4871e202d69b065bfc3d8cf74 not found: ID does not exist" Dec 07 19:41:57 crc kubenswrapper[4815]: I1207 19:41:57.787357 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" path="/var/lib/kubelet/pods/b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8/volumes" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.873802 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:02 crc kubenswrapper[4815]: E1207 19:42:02.874850 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="extract-utilities" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.874868 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="extract-utilities" Dec 07 19:42:02 crc kubenswrapper[4815]: E1207 19:42:02.874906 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="registry-server" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.874930 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="registry-server" Dec 07 19:42:02 crc kubenswrapper[4815]: E1207 19:42:02.874950 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="extract-content" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.874958 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="extract-content" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.875213 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4b8371a-e1cd-448a-91f1-0fd4a7ae8bf8" containerName="registry-server" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.876642 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.891942 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.970858 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqrlx\" (UniqueName: \"kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.971105 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content\") pod \"certified-operators-6x8vm\" (UID: 
\"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:02 crc kubenswrapper[4815]: I1207 19:42:02.971271 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.073548 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.073634 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.073782 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqrlx\" (UniqueName: \"kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.074294 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content\") pod \"certified-operators-6x8vm\" (UID: 
\"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.074397 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.093180 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqrlx\" (UniqueName: \"kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx\") pod \"certified-operators-6x8vm\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.207808 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:03 crc kubenswrapper[4815]: I1207 19:42:03.713244 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:04 crc kubenswrapper[4815]: I1207 19:42:04.607675 4815 generic.go:334] "Generic (PLEG): container finished" podID="ec376742-d931-4439-a8ef-5edd028476f1" containerID="9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c" exitCode=0 Dec 07 19:42:04 crc kubenswrapper[4815]: I1207 19:42:04.607729 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerDied","Data":"9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c"} Dec 07 19:42:04 crc kubenswrapper[4815]: I1207 19:42:04.607758 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" 
event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerStarted","Data":"742939cf355fcfae9dc5443cdf133c4d8f07d94694c8b81342913d10ab1566e7"} Dec 07 19:42:05 crc kubenswrapper[4815]: I1207 19:42:05.616615 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerStarted","Data":"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef"} Dec 07 19:42:07 crc kubenswrapper[4815]: I1207 19:42:07.638859 4815 generic.go:334] "Generic (PLEG): container finished" podID="ec376742-d931-4439-a8ef-5edd028476f1" containerID="b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef" exitCode=0 Dec 07 19:42:07 crc kubenswrapper[4815]: I1207 19:42:07.638944 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerDied","Data":"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef"} Dec 07 19:42:07 crc kubenswrapper[4815]: I1207 19:42:07.771215 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:42:07 crc kubenswrapper[4815]: E1207 19:42:07.771440 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:42:08 crc kubenswrapper[4815]: I1207 19:42:08.649663 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" 
event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerStarted","Data":"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080"} Dec 07 19:42:08 crc kubenswrapper[4815]: I1207 19:42:08.675738 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6x8vm" podStartSLOduration=3.241914704 podStartE2EDuration="6.675717783s" podCreationTimestamp="2025-12-07 19:42:02 +0000 UTC" firstStartedPulling="2025-12-07 19:42:04.61073702 +0000 UTC m=+1629.189727065" lastFinishedPulling="2025-12-07 19:42:08.044540089 +0000 UTC m=+1632.623530144" observedRunningTime="2025-12-07 19:42:08.670724083 +0000 UTC m=+1633.249714148" watchObservedRunningTime="2025-12-07 19:42:08.675717783 +0000 UTC m=+1633.254707848" Dec 07 19:42:13 crc kubenswrapper[4815]: I1207 19:42:13.208894 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:13 crc kubenswrapper[4815]: I1207 19:42:13.209536 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:13 crc kubenswrapper[4815]: I1207 19:42:13.281342 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:13 crc kubenswrapper[4815]: I1207 19:42:13.762894 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:13 crc kubenswrapper[4815]: I1207 19:42:13.826343 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:15 crc kubenswrapper[4815]: I1207 19:42:15.716363 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6x8vm" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="registry-server" 
containerID="cri-o://d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080" gracePeriod=2 Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.173549 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.362602 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities\") pod \"ec376742-d931-4439-a8ef-5edd028476f1\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.363037 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqrlx\" (UniqueName: \"kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx\") pod \"ec376742-d931-4439-a8ef-5edd028476f1\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.363303 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content\") pod \"ec376742-d931-4439-a8ef-5edd028476f1\" (UID: \"ec376742-d931-4439-a8ef-5edd028476f1\") " Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.363535 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities" (OuterVolumeSpecName: "utilities") pod "ec376742-d931-4439-a8ef-5edd028476f1" (UID: "ec376742-d931-4439-a8ef-5edd028476f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.364414 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.410153 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx" (OuterVolumeSpecName: "kube-api-access-pqrlx") pod "ec376742-d931-4439-a8ef-5edd028476f1" (UID: "ec376742-d931-4439-a8ef-5edd028476f1"). InnerVolumeSpecName "kube-api-access-pqrlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.461787 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec376742-d931-4439-a8ef-5edd028476f1" (UID: "ec376742-d931-4439-a8ef-5edd028476f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.466774 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec376742-d931-4439-a8ef-5edd028476f1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.466807 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqrlx\" (UniqueName: \"kubernetes.io/projected/ec376742-d931-4439-a8ef-5edd028476f1-kube-api-access-pqrlx\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.730177 4815 generic.go:334] "Generic (PLEG): container finished" podID="ec376742-d931-4439-a8ef-5edd028476f1" containerID="d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080" exitCode=0 Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.730243 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerDied","Data":"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080"} Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.730272 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6x8vm" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.730325 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6x8vm" event={"ID":"ec376742-d931-4439-a8ef-5edd028476f1","Type":"ContainerDied","Data":"742939cf355fcfae9dc5443cdf133c4d8f07d94694c8b81342913d10ab1566e7"} Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.730356 4815 scope.go:117] "RemoveContainer" containerID="d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.757992 4815 scope.go:117] "RemoveContainer" containerID="b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.781546 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.791401 4815 scope.go:117] "RemoveContainer" containerID="9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.796055 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6x8vm"] Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.823136 4815 scope.go:117] "RemoveContainer" containerID="d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080" Dec 07 19:42:16 crc kubenswrapper[4815]: E1207 19:42:16.823657 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080\": container with ID starting with d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080 not found: ID does not exist" containerID="d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.823702 4815 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080"} err="failed to get container status \"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080\": rpc error: code = NotFound desc = could not find container \"d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080\": container with ID starting with d40172214156545f7da0587239f438efeaf43058257870f59249f2c71f366080 not found: ID does not exist" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.823729 4815 scope.go:117] "RemoveContainer" containerID="b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef" Dec 07 19:42:16 crc kubenswrapper[4815]: E1207 19:42:16.824134 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef\": container with ID starting with b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef not found: ID does not exist" containerID="b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.824192 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef"} err="failed to get container status \"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef\": rpc error: code = NotFound desc = could not find container \"b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef\": container with ID starting with b7528605cec3d1b0154c97f47680277f9a9732b09a393c5a453ee84817102aef not found: ID does not exist" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.824221 4815 scope.go:117] "RemoveContainer" containerID="9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c" Dec 07 19:42:16 crc kubenswrapper[4815]: E1207 
19:42:16.825237 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c\": container with ID starting with 9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c not found: ID does not exist" containerID="9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c" Dec 07 19:42:16 crc kubenswrapper[4815]: I1207 19:42:16.825279 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c"} err="failed to get container status \"9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c\": rpc error: code = NotFound desc = could not find container \"9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c\": container with ID starting with 9cac19b61f697da9434c5f56010a9473aedd87af83664246dc4960cd7db4ae7c not found: ID does not exist" Dec 07 19:42:17 crc kubenswrapper[4815]: I1207 19:42:17.783307 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec376742-d931-4439-a8ef-5edd028476f1" path="/var/lib/kubelet/pods/ec376742-d931-4439-a8ef-5edd028476f1/volumes" Dec 07 19:42:21 crc kubenswrapper[4815]: I1207 19:42:21.769659 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:42:21 crc kubenswrapper[4815]: E1207 19:42:21.770345 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:42:21 crc kubenswrapper[4815]: I1207 19:42:21.941621 
4815 scope.go:117] "RemoveContainer" containerID="8e79d983bc26c3afdd6f97bef7ed4cb4ff56dc471e407cc1606beba10a7dbac5" Dec 07 19:42:21 crc kubenswrapper[4815]: I1207 19:42:21.970118 4815 scope.go:117] "RemoveContainer" containerID="77e9cd6e5aa474c3b93dce8f1044055eedb05d3144f0668a29681ad252a757bd" Dec 07 19:42:36 crc kubenswrapper[4815]: I1207 19:42:36.769136 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:42:36 crc kubenswrapper[4815]: E1207 19:42:36.770685 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.967583 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:42 crc kubenswrapper[4815]: E1207 19:42:42.968459 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="registry-server" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.968475 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="registry-server" Dec 07 19:42:42 crc kubenswrapper[4815]: E1207 19:42:42.968506 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="extract-content" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.968514 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="extract-content" Dec 07 19:42:42 crc kubenswrapper[4815]: E1207 19:42:42.968525 4815 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="extract-utilities" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.968540 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="extract-utilities" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.968859 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec376742-d931-4439-a8ef-5edd028476f1" containerName="registry-server" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.970709 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:42 crc kubenswrapper[4815]: I1207 19:42:42.984491 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.143490 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.143771 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppvgs\" (UniqueName: \"kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.143946 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.245884 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.246041 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.246148 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppvgs\" (UniqueName: \"kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.246689 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.246907 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.270779 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppvgs\" (UniqueName: \"kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs\") pod \"community-operators-2fdpx\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.301186 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:43 crc kubenswrapper[4815]: I1207 19:42:43.754683 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:44 crc kubenswrapper[4815]: I1207 19:42:44.031875 4815 generic.go:334] "Generic (PLEG): container finished" podID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerID="4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747" exitCode=0 Dec 07 19:42:44 crc kubenswrapper[4815]: I1207 19:42:44.031969 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerDied","Data":"4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747"} Dec 07 19:42:44 crc kubenswrapper[4815]: I1207 19:42:44.032898 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerStarted","Data":"922b8557f2605379a1bfae1a9eeb845a0cb945ae560d3f62f2a743eee8c0d016"} Dec 07 19:42:44 crc kubenswrapper[4815]: I1207 19:42:44.034470 4815 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 07 19:42:45 crc kubenswrapper[4815]: I1207 19:42:45.041478 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerStarted","Data":"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c"} Dec 07 19:42:46 crc kubenswrapper[4815]: E1207 19:42:46.821003 4815 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abcca75_a25b_49d2_8c61_59402faf16c5.slice/crio-conmon-5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c.scope\": RecentStats: unable to find data in memory cache]" Dec 07 19:42:47 crc kubenswrapper[4815]: I1207 19:42:47.069686 4815 generic.go:334] "Generic (PLEG): container finished" podID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerID="5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c" exitCode=0 Dec 07 19:42:47 crc kubenswrapper[4815]: I1207 19:42:47.070041 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerDied","Data":"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c"} Dec 07 19:42:48 crc kubenswrapper[4815]: I1207 19:42:48.079824 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerStarted","Data":"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30"} Dec 07 19:42:48 crc kubenswrapper[4815]: I1207 19:42:48.108533 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2fdpx" podStartSLOduration=2.676871899 podStartE2EDuration="6.108513427s" podCreationTimestamp="2025-12-07 19:42:42 +0000 
UTC" firstStartedPulling="2025-12-07 19:42:44.034235613 +0000 UTC m=+1668.613225658" lastFinishedPulling="2025-12-07 19:42:47.465877131 +0000 UTC m=+1672.044867186" observedRunningTime="2025-12-07 19:42:48.103779024 +0000 UTC m=+1672.682769069" watchObservedRunningTime="2025-12-07 19:42:48.108513427 +0000 UTC m=+1672.687503472" Dec 07 19:42:49 crc kubenswrapper[4815]: I1207 19:42:49.770394 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:42:49 crc kubenswrapper[4815]: E1207 19:42:49.771143 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:42:53 crc kubenswrapper[4815]: I1207 19:42:53.302067 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:53 crc kubenswrapper[4815]: I1207 19:42:53.302432 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:53 crc kubenswrapper[4815]: I1207 19:42:53.351592 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:54 crc kubenswrapper[4815]: I1207 19:42:54.179129 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:54 crc kubenswrapper[4815]: I1207 19:42:54.227839 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.148226 
4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2fdpx" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="registry-server" containerID="cri-o://2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30" gracePeriod=2 Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.528466 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.723366 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content\") pod \"6abcca75-a25b-49d2-8c61-59402faf16c5\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.723501 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppvgs\" (UniqueName: \"kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs\") pod \"6abcca75-a25b-49d2-8c61-59402faf16c5\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.723564 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities\") pod \"6abcca75-a25b-49d2-8c61-59402faf16c5\" (UID: \"6abcca75-a25b-49d2-8c61-59402faf16c5\") " Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.725161 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities" (OuterVolumeSpecName: "utilities") pod "6abcca75-a25b-49d2-8c61-59402faf16c5" (UID: "6abcca75-a25b-49d2-8c61-59402faf16c5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.731216 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs" (OuterVolumeSpecName: "kube-api-access-ppvgs") pod "6abcca75-a25b-49d2-8c61-59402faf16c5" (UID: "6abcca75-a25b-49d2-8c61-59402faf16c5"). InnerVolumeSpecName "kube-api-access-ppvgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.788678 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6abcca75-a25b-49d2-8c61-59402faf16c5" (UID: "6abcca75-a25b-49d2-8c61-59402faf16c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.825096 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.825128 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppvgs\" (UniqueName: \"kubernetes.io/projected/6abcca75-a25b-49d2-8c61-59402faf16c5-kube-api-access-ppvgs\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:56 crc kubenswrapper[4815]: I1207 19:42:56.825142 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6abcca75-a25b-49d2-8c61-59402faf16c5-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.158742 4815 generic.go:334] "Generic (PLEG): container finished" podID="6abcca75-a25b-49d2-8c61-59402faf16c5" 
containerID="2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30" exitCode=0 Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.158781 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerDied","Data":"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30"} Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.158805 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2fdpx" event={"ID":"6abcca75-a25b-49d2-8c61-59402faf16c5","Type":"ContainerDied","Data":"922b8557f2605379a1bfae1a9eeb845a0cb945ae560d3f62f2a743eee8c0d016"} Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.158815 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2fdpx" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.158820 4815 scope.go:117] "RemoveContainer" containerID="2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.196688 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.207068 4815 scope.go:117] "RemoveContainer" containerID="5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.208071 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2fdpx"] Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.236496 4815 scope.go:117] "RemoveContainer" containerID="4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.277607 4815 scope.go:117] "RemoveContainer" containerID="2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30" Dec 07 
19:42:57 crc kubenswrapper[4815]: E1207 19:42:57.277974 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30\": container with ID starting with 2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30 not found: ID does not exist" containerID="2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.278010 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30"} err="failed to get container status \"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30\": rpc error: code = NotFound desc = could not find container \"2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30\": container with ID starting with 2aac56fd6bcfc4deaba38648da5aced9e31d470796d58cb6a7426c404acc7a30 not found: ID does not exist" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.278036 4815 scope.go:117] "RemoveContainer" containerID="5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c" Dec 07 19:42:57 crc kubenswrapper[4815]: E1207 19:42:57.278260 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c\": container with ID starting with 5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c not found: ID does not exist" containerID="5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.278303 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c"} err="failed to get container status 
\"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c\": rpc error: code = NotFound desc = could not find container \"5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c\": container with ID starting with 5e487d95dfc114a96531cfe097024a920209fd7677f3ff21bdb5515fb49dbf3c not found: ID does not exist" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.278323 4815 scope.go:117] "RemoveContainer" containerID="4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747" Dec 07 19:42:57 crc kubenswrapper[4815]: E1207 19:42:57.278500 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747\": container with ID starting with 4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747 not found: ID does not exist" containerID="4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.278541 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747"} err="failed to get container status \"4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747\": rpc error: code = NotFound desc = could not find container \"4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747\": container with ID starting with 4b3fef84f3020c03cff8ad141239ef8826649f6e181deb4a493fe7bac651a747 not found: ID does not exist" Dec 07 19:42:57 crc kubenswrapper[4815]: I1207 19:42:57.780037 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" path="/var/lib/kubelet/pods/6abcca75-a25b-49d2-8c61-59402faf16c5/volumes" Dec 07 19:43:03 crc kubenswrapper[4815]: I1207 19:43:03.212349 4815 generic.go:334] "Generic (PLEG): container finished" podID="ee1d4597-bd09-4333-b5bd-e50c52c92cd3" 
containerID="3e43f36a7c67aaa817a84cd9b0c06d463555e74cd0d7dd42f4028fcf42f30af2" exitCode=0 Dec 07 19:43:03 crc kubenswrapper[4815]: I1207 19:43:03.212426 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" event={"ID":"ee1d4597-bd09-4333-b5bd-e50c52c92cd3","Type":"ContainerDied","Data":"3e43f36a7c67aaa817a84cd9b0c06d463555e74cd0d7dd42f4028fcf42f30af2"} Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.647996 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.769843 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:43:04 crc kubenswrapper[4815]: E1207 19:43:04.770270 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.773501 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle\") pod \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.773658 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qphs4\" (UniqueName: \"kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4\") pod 
\"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.773871 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory\") pod \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.774043 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key\") pod \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\" (UID: \"ee1d4597-bd09-4333-b5bd-e50c52c92cd3\") " Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.779247 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ee1d4597-bd09-4333-b5bd-e50c52c92cd3" (UID: "ee1d4597-bd09-4333-b5bd-e50c52c92cd3"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.786212 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4" (OuterVolumeSpecName: "kube-api-access-qphs4") pod "ee1d4597-bd09-4333-b5bd-e50c52c92cd3" (UID: "ee1d4597-bd09-4333-b5bd-e50c52c92cd3"). InnerVolumeSpecName "kube-api-access-qphs4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.805556 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory" (OuterVolumeSpecName: "inventory") pod "ee1d4597-bd09-4333-b5bd-e50c52c92cd3" (UID: "ee1d4597-bd09-4333-b5bd-e50c52c92cd3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.825797 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ee1d4597-bd09-4333-b5bd-e50c52c92cd3" (UID: "ee1d4597-bd09-4333-b5bd-e50c52c92cd3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.878553 4815 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.878754 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qphs4\" (UniqueName: \"kubernetes.io/projected/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-kube-api-access-qphs4\") on node \"crc\" DevicePath \"\"" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.878850 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:43:04 crc kubenswrapper[4815]: I1207 19:43:04.878947 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ee1d4597-bd09-4333-b5bd-e50c52c92cd3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:43:05 crc 
kubenswrapper[4815]: I1207 19:43:05.245907 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" event={"ID":"ee1d4597-bd09-4333-b5bd-e50c52c92cd3","Type":"ContainerDied","Data":"41863916b3c039cd2ab481f643c14a675c55790d660bc478677bd40be953770e"} Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.246004 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41863916b3c039cd2ab481f643c14a675c55790d660bc478677bd40be953770e" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.246038 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.369972 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw"] Dec 07 19:43:05 crc kubenswrapper[4815]: E1207 19:43:05.374814 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1d4597-bd09-4333-b5bd-e50c52c92cd3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.374847 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1d4597-bd09-4333-b5bd-e50c52c92cd3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 07 19:43:05 crc kubenswrapper[4815]: E1207 19:43:05.374887 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="extract-content" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.374899 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="extract-content" Dec 07 19:43:05 crc kubenswrapper[4815]: E1207 19:43:05.374951 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="registry-server" 
Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.374965 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="registry-server" Dec 07 19:43:05 crc kubenswrapper[4815]: E1207 19:43:05.374988 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="extract-utilities" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.375000 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="extract-utilities" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.375337 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1d4597-bd09-4333-b5bd-e50c52c92cd3" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.375388 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="6abcca75-a25b-49d2-8c61-59402faf16c5" containerName="registry-server" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.376328 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.385444 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw"] Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.390630 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.391247 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.391256 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.392047 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.488660 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drbdt\" (UniqueName: \"kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.488726 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 
19:43:05.488797 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.590663 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drbdt\" (UniqueName: \"kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.590732 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.590793 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.596899 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.598410 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.606722 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drbdt\" (UniqueName: \"kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:05 crc kubenswrapper[4815]: I1207 19:43:05.747021 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:43:06 crc kubenswrapper[4815]: I1207 19:43:06.241942 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw"] Dec 07 19:43:06 crc kubenswrapper[4815]: I1207 19:43:06.259706 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" event={"ID":"7a7fbd43-dafb-4aaa-be96-f84484865f64","Type":"ContainerStarted","Data":"664458af3f709c3354e9124eb5a4a8614132c3d8bd6b92c3879db50af909b68c"} Dec 07 19:43:07 crc kubenswrapper[4815]: I1207 19:43:07.272491 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" event={"ID":"7a7fbd43-dafb-4aaa-be96-f84484865f64","Type":"ContainerStarted","Data":"054ced52dc918fa747c3b6884fe1a48db3c41e22634c98aca4aa3e24ddffec02"} Dec 07 19:43:07 crc kubenswrapper[4815]: I1207 19:43:07.298554 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" podStartSLOduration=1.8640522019999999 podStartE2EDuration="2.298531499s" podCreationTimestamp="2025-12-07 19:43:05 +0000 UTC" firstStartedPulling="2025-12-07 19:43:06.250858322 +0000 UTC m=+1690.829848367" lastFinishedPulling="2025-12-07 19:43:06.685337619 +0000 UTC m=+1691.264327664" observedRunningTime="2025-12-07 19:43:07.295559926 +0000 UTC m=+1691.874549971" watchObservedRunningTime="2025-12-07 19:43:07.298531499 +0000 UTC m=+1691.877521544" Dec 07 19:43:17 crc kubenswrapper[4815]: I1207 19:43:17.770482 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:43:17 crc kubenswrapper[4815]: E1207 19:43:17.772123 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:43:22 crc kubenswrapper[4815]: I1207 19:43:22.044753 4815 scope.go:117] "RemoveContainer" containerID="2e68f59d3031524fd452c0742278d3666deb9d8355bf4a49eb552eef5630ae5e" Dec 07 19:43:22 crc kubenswrapper[4815]: I1207 19:43:22.085527 4815 scope.go:117] "RemoveContainer" containerID="fb6bed024ebfbb77d55ee677f67757b14d7ee99c405d976cc72fbbb84ba8eae6" Dec 07 19:43:28 crc kubenswrapper[4815]: I1207 19:43:28.769975 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:43:28 crc kubenswrapper[4815]: E1207 19:43:28.770768 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:43:43 crc kubenswrapper[4815]: I1207 19:43:43.772707 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:43:43 crc kubenswrapper[4815]: E1207 19:43:43.773425 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" 
podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.051066 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-qqh8l"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.063960 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-z7kz6"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.082359 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6091-account-create-update-9gll9"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.105071 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-z7kz6"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.114387 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6091-account-create-update-9gll9"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.121814 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-qqh8l"] Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.779232 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29003470-3a1c-4604-bb14-e005d825b5d2" path="/var/lib/kubelet/pods/29003470-3a1c-4604-bb14-e005d825b5d2/volumes" Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.779848 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41c495d9-ee0f-4f79-b451-6ac61ec6db38" path="/var/lib/kubelet/pods/41c495d9-ee0f-4f79-b451-6ac61ec6db38/volumes" Dec 07 19:43:53 crc kubenswrapper[4815]: I1207 19:43:53.780446 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d318fb6c-147d-42f1-8b19-0abd6b58c83c" path="/var/lib/kubelet/pods/d318fb6c-147d-42f1-8b19-0abd6b58c83c/volumes" Dec 07 19:43:54 crc kubenswrapper[4815]: I1207 19:43:54.044045 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-49f3-account-create-update-v7qzn"] Dec 07 19:43:54 crc 
kubenswrapper[4815]: I1207 19:43:54.059926 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-49f3-account-create-update-v7qzn"] Dec 07 19:43:54 crc kubenswrapper[4815]: I1207 19:43:54.803216 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:43:54 crc kubenswrapper[4815]: E1207 19:43:54.803519 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:43:55 crc kubenswrapper[4815]: I1207 19:43:55.798354 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c5c96e-14ac-43c0-b753-faf913b71ed9" path="/var/lib/kubelet/pods/86c5c96e-14ac-43c0-b753-faf913b71ed9/volumes" Dec 07 19:43:58 crc kubenswrapper[4815]: I1207 19:43:58.026074 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-dbfd4"] Dec 07 19:43:58 crc kubenswrapper[4815]: I1207 19:43:58.035660 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-dbfd4"] Dec 07 19:43:59 crc kubenswrapper[4815]: I1207 19:43:59.032823 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-8d69-account-create-update-5gjns"] Dec 07 19:43:59 crc kubenswrapper[4815]: I1207 19:43:59.043147 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-8d69-account-create-update-5gjns"] Dec 07 19:43:59 crc kubenswrapper[4815]: I1207 19:43:59.782582 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce65cf3-4329-4b48-8171-f04889224482" path="/var/lib/kubelet/pods/4ce65cf3-4329-4b48-8171-f04889224482/volumes" Dec 
07 19:43:59 crc kubenswrapper[4815]: I1207 19:43:59.783315 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b7b56ed-7eb9-4bb0-addd-1e234f0f2204" path="/var/lib/kubelet/pods/8b7b56ed-7eb9-4bb0-addd-1e234f0f2204/volumes" Dec 07 19:44:06 crc kubenswrapper[4815]: I1207 19:44:06.770814 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:44:06 crc kubenswrapper[4815]: E1207 19:44:06.771987 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:44:20 crc kubenswrapper[4815]: I1207 19:44:20.769807 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:44:20 crc kubenswrapper[4815]: E1207 19:44:20.770717 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.150314 4815 scope.go:117] "RemoveContainer" containerID="413fe81c61b692e04e1d2a7fb37af579a1e2f298dc1b7bd06aaf96021a73e74e" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.175986 4815 scope.go:117] "RemoveContainer" containerID="987d1d8085f5aa4d7aac4b25f0914e81da0aa0a55a6eeb0609ef98ac15795d09" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 
19:44:22.220496 4815 scope.go:117] "RemoveContainer" containerID="29aa9f4931f11735f982d8534d433d442982a877dea241f2844724847f9738d5" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.255721 4815 scope.go:117] "RemoveContainer" containerID="3c5a1416cc04506e01c294be2f260a88a9dc7c21d8f7cea9650a51f522750029" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.292392 4815 scope.go:117] "RemoveContainer" containerID="c604803ec0f05e625d3d60a546f978661ca9cba5978a160b9056cf053104af25" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.338808 4815 scope.go:117] "RemoveContainer" containerID="32d06d9aeda0cb5ad5ab366cd1a5a3a6c0a29ce0220962dfec177f9970840ae7" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.361149 4815 scope.go:117] "RemoveContainer" containerID="6e82e266d076860201e90b246dc6e841f1554832841722c3755e22355a393a53" Dec 07 19:44:22 crc kubenswrapper[4815]: I1207 19:44:22.394624 4815 scope.go:117] "RemoveContainer" containerID="688b5a1e33aec6fb3f8977ce625ae7b61a265d7bf10a234593dd01692d4aeb78" Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.047429 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" event={"ID":"7a7fbd43-dafb-4aaa-be96-f84484865f64","Type":"ContainerDied","Data":"054ced52dc918fa747c3b6884fe1a48db3c41e22634c98aca4aa3e24ddffec02"} Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.047376 4815 generic.go:334] "Generic (PLEG): container finished" podID="7a7fbd43-dafb-4aaa-be96-f84484865f64" containerID="054ced52dc918fa747c3b6884fe1a48db3c41e22634c98aca4aa3e24ddffec02" exitCode=0 Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.064771 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tjblv"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.075042 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-t9b9t"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 
19:44:26.086976 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7561-account-create-update-v2s5j"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.099971 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tjblv"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.111197 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-t9b9t"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.122285 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f0f9-account-create-update-djwr2"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.129246 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wgnj5"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.136668 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7561-account-create-update-v2s5j"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.145417 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f0f9-account-create-update-djwr2"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.156276 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wgnj5"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.163329 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-22e4-account-create-update-5hjjx"] Dec 07 19:44:26 crc kubenswrapper[4815]: I1207 19:44:26.177237 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-22e4-account-create-update-5hjjx"] Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.458428 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.532427 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory\") pod \"7a7fbd43-dafb-4aaa-be96-f84484865f64\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.532569 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key\") pod \"7a7fbd43-dafb-4aaa-be96-f84484865f64\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.533315 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drbdt\" (UniqueName: \"kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt\") pod \"7a7fbd43-dafb-4aaa-be96-f84484865f64\" (UID: \"7a7fbd43-dafb-4aaa-be96-f84484865f64\") " Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.548293 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt" (OuterVolumeSpecName: "kube-api-access-drbdt") pod "7a7fbd43-dafb-4aaa-be96-f84484865f64" (UID: "7a7fbd43-dafb-4aaa-be96-f84484865f64"). InnerVolumeSpecName "kube-api-access-drbdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.561230 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory" (OuterVolumeSpecName: "inventory") pod "7a7fbd43-dafb-4aaa-be96-f84484865f64" (UID: "7a7fbd43-dafb-4aaa-be96-f84484865f64"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.568950 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7a7fbd43-dafb-4aaa-be96-f84484865f64" (UID: "7a7fbd43-dafb-4aaa-be96-f84484865f64"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.635065 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.635103 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7a7fbd43-dafb-4aaa-be96-f84484865f64-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.635113 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drbdt\" (UniqueName: \"kubernetes.io/projected/7a7fbd43-dafb-4aaa-be96-f84484865f64-kube-api-access-drbdt\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.792573 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08539e2c-7b61-45a4-8dad-7763d4cc8d01" path="/var/lib/kubelet/pods/08539e2c-7b61-45a4-8dad-7763d4cc8d01/volumes" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.793991 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a63d827-767e-4965-871c-be277190b680" path="/var/lib/kubelet/pods/2a63d827-767e-4965-871c-be277190b680/volumes" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.795325 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72214816-2ac0-4ba7-99f8-6d56479c7e40" 
path="/var/lib/kubelet/pods/72214816-2ac0-4ba7-99f8-6d56479c7e40/volumes" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.812154 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b4585c-b74e-47d2-9b7e-6a61791104be" path="/var/lib/kubelet/pods/b2b4585c-b74e-47d2-9b7e-6a61791104be/volumes" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.813825 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ee2adb-b254-42cc-8100-09c598d670ef" path="/var/lib/kubelet/pods/c4ee2adb-b254-42cc-8100-09c598d670ef/volumes" Dec 07 19:44:27 crc kubenswrapper[4815]: I1207 19:44:27.815096 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492" path="/var/lib/kubelet/pods/c7ceffd5-e9d5-42d6-a93f-cbbfa94c0492/volumes" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.067366 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" event={"ID":"7a7fbd43-dafb-4aaa-be96-f84484865f64","Type":"ContainerDied","Data":"664458af3f709c3354e9124eb5a4a8614132c3d8bd6b92c3879db50af909b68c"} Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.067429 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664458af3f709c3354e9124eb5a4a8614132c3d8bd6b92c3879db50af909b68c" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.067457 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.163908 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5"] Dec 07 19:44:28 crc kubenswrapper[4815]: E1207 19:44:28.164316 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a7fbd43-dafb-4aaa-be96-f84484865f64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.164332 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a7fbd43-dafb-4aaa-be96-f84484865f64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.164515 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a7fbd43-dafb-4aaa-be96-f84484865f64" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.165103 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.167119 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.167151 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.168809 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.168828 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.181589 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5"] Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.245421 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4jl8\" (UniqueName: \"kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.245557 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 
19:44:28.245697 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.346377 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4jl8\" (UniqueName: \"kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.346461 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.346563 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.351158 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.358774 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.364434 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4jl8\" (UniqueName: \"kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:28 crc kubenswrapper[4815]: I1207 19:44:28.484116 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:29 crc kubenswrapper[4815]: I1207 19:44:29.065402 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5"] Dec 07 19:44:29 crc kubenswrapper[4815]: W1207 19:44:29.069279 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91293094_d529_4fbf_84db_c166a7ebcb7f.slice/crio-79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1 WatchSource:0}: Error finding container 79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1: Status 404 returned error can't find the container with id 79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1 Dec 07 19:44:30 crc kubenswrapper[4815]: I1207 19:44:30.088176 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" event={"ID":"91293094-d529-4fbf-84db-c166a7ebcb7f","Type":"ContainerStarted","Data":"4545000b891a6a7de93deaec65e00fcd5a65d51e3308473342422003f41a285f"} Dec 07 19:44:30 crc kubenswrapper[4815]: I1207 19:44:30.089651 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" event={"ID":"91293094-d529-4fbf-84db-c166a7ebcb7f","Type":"ContainerStarted","Data":"79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1"} Dec 07 19:44:30 crc kubenswrapper[4815]: I1207 19:44:30.116669 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" podStartSLOduration=1.680373125 podStartE2EDuration="2.116654252s" podCreationTimestamp="2025-12-07 19:44:28 +0000 UTC" firstStartedPulling="2025-12-07 19:44:29.07271988 +0000 UTC m=+1773.651709925" lastFinishedPulling="2025-12-07 19:44:29.509001007 +0000 UTC 
m=+1774.087991052" observedRunningTime="2025-12-07 19:44:30.114995316 +0000 UTC m=+1774.693985401" watchObservedRunningTime="2025-12-07 19:44:30.116654252 +0000 UTC m=+1774.695644297" Dec 07 19:44:31 crc kubenswrapper[4815]: I1207 19:44:31.027142 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-pvvmf"] Dec 07 19:44:31 crc kubenswrapper[4815]: I1207 19:44:31.035086 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-pvvmf"] Dec 07 19:44:31 crc kubenswrapper[4815]: I1207 19:44:31.781061 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb0c957-b8cf-48ef-8a8d-f409fa031e2f" path="/var/lib/kubelet/pods/4bb0c957-b8cf-48ef-8a8d-f409fa031e2f/volumes" Dec 07 19:44:35 crc kubenswrapper[4815]: I1207 19:44:35.780330 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:44:35 crc kubenswrapper[4815]: E1207 19:44:35.781669 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:44:36 crc kubenswrapper[4815]: I1207 19:44:36.148270 4815 generic.go:334] "Generic (PLEG): container finished" podID="91293094-d529-4fbf-84db-c166a7ebcb7f" containerID="4545000b891a6a7de93deaec65e00fcd5a65d51e3308473342422003f41a285f" exitCode=0 Dec 07 19:44:36 crc kubenswrapper[4815]: I1207 19:44:36.148327 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" 
event={"ID":"91293094-d529-4fbf-84db-c166a7ebcb7f","Type":"ContainerDied","Data":"4545000b891a6a7de93deaec65e00fcd5a65d51e3308473342422003f41a285f"} Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.588079 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.763768 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4jl8\" (UniqueName: \"kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8\") pod \"91293094-d529-4fbf-84db-c166a7ebcb7f\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.763860 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory\") pod \"91293094-d529-4fbf-84db-c166a7ebcb7f\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.763934 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key\") pod \"91293094-d529-4fbf-84db-c166a7ebcb7f\" (UID: \"91293094-d529-4fbf-84db-c166a7ebcb7f\") " Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.777749 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8" (OuterVolumeSpecName: "kube-api-access-c4jl8") pod "91293094-d529-4fbf-84db-c166a7ebcb7f" (UID: "91293094-d529-4fbf-84db-c166a7ebcb7f"). InnerVolumeSpecName "kube-api-access-c4jl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.793975 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory" (OuterVolumeSpecName: "inventory") pod "91293094-d529-4fbf-84db-c166a7ebcb7f" (UID: "91293094-d529-4fbf-84db-c166a7ebcb7f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.799081 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "91293094-d529-4fbf-84db-c166a7ebcb7f" (UID: "91293094-d529-4fbf-84db-c166a7ebcb7f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.866515 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4jl8\" (UniqueName: \"kubernetes.io/projected/91293094-d529-4fbf-84db-c166a7ebcb7f-kube-api-access-c4jl8\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.866549 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:37 crc kubenswrapper[4815]: I1207 19:44:37.866560 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/91293094-d529-4fbf-84db-c166a7ebcb7f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.171072 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" 
event={"ID":"91293094-d529-4fbf-84db-c166a7ebcb7f","Type":"ContainerDied","Data":"79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1"} Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.171476 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79697f111688289041513e6acc0b8db0e59e4220b557a6679074dbb1237de3a1" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.171154 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.244762 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85"] Dec 07 19:44:38 crc kubenswrapper[4815]: E1207 19:44:38.245382 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91293094-d529-4fbf-84db-c166a7ebcb7f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.245401 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="91293094-d529-4fbf-84db-c166a7ebcb7f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.245602 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="91293094-d529-4fbf-84db-c166a7ebcb7f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.246242 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.248338 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.248563 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.255889 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.260315 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.265037 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85"] Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.375154 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgtc\" (UniqueName: \"kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.375209 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.375271 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.476953 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgtc\" (UniqueName: \"kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.476997 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.477059 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.533628 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: 
\"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.539327 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.548703 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgtc\" (UniqueName: \"kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-t8v85\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:38 crc kubenswrapper[4815]: I1207 19:44:38.563676 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:44:39 crc kubenswrapper[4815]: I1207 19:44:39.129791 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85"] Dec 07 19:44:39 crc kubenswrapper[4815]: I1207 19:44:39.181701 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" event={"ID":"acfde0af-ed46-4636-b117-d2c2c7b2c0c8","Type":"ContainerStarted","Data":"c0a319f3eac3eef6a3aef53827517d02fac39cc7af430535cafaec2195ea41a0"} Dec 07 19:44:40 crc kubenswrapper[4815]: I1207 19:44:40.198422 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" event={"ID":"acfde0af-ed46-4636-b117-d2c2c7b2c0c8","Type":"ContainerStarted","Data":"2f6a0ac1356f1f6063ac6dccff231c0e3f65896581cba4238cc1be1515f31358"} Dec 07 19:44:40 crc kubenswrapper[4815]: I1207 19:44:40.232016 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" podStartSLOduration=1.8011844959999999 podStartE2EDuration="2.231999441s" podCreationTimestamp="2025-12-07 19:44:38 +0000 UTC" firstStartedPulling="2025-12-07 19:44:39.149064663 +0000 UTC m=+1783.728054708" lastFinishedPulling="2025-12-07 19:44:39.579879608 +0000 UTC m=+1784.158869653" observedRunningTime="2025-12-07 19:44:40.225286702 +0000 UTC m=+1784.804276807" watchObservedRunningTime="2025-12-07 19:44:40.231999441 +0000 UTC m=+1784.810989486" Dec 07 19:44:46 crc kubenswrapper[4815]: I1207 19:44:46.770646 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:44:46 crc kubenswrapper[4815]: E1207 19:44:46.771588 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:44:58 crc kubenswrapper[4815]: I1207 19:44:58.042422 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s7wqs"] Dec 07 19:44:58 crc kubenswrapper[4815]: I1207 19:44:58.051621 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s7wqs"] Dec 07 19:44:59 crc kubenswrapper[4815]: I1207 19:44:59.770001 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:44:59 crc kubenswrapper[4815]: E1207 19:44:59.770570 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:44:59 crc kubenswrapper[4815]: I1207 19:44:59.785189 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e07853f-f24e-4bf6-8af1-15e4e9cccbc4" path="/var/lib/kubelet/pods/9e07853f-f24e-4bf6-8af1-15e4e9cccbc4/volumes" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.041386 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-p5hgv"] Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.074479 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-p5hgv"] Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.152636 4815 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf"] Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.153992 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.157631 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.157716 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.169334 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf"] Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.292781 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2cl\" (UniqueName: \"kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.293398 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.293517 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.394760 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2cl\" (UniqueName: \"kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.394818 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.394883 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.395633 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.401365 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.422216 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2cl\" (UniqueName: \"kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl\") pod \"collect-profiles-29418945-8trsf\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.479485 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:00 crc kubenswrapper[4815]: I1207 19:45:00.743018 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf"] Dec 07 19:45:01 crc kubenswrapper[4815]: I1207 19:45:01.387876 4815 generic.go:334] "Generic (PLEG): container finished" podID="29f8422b-0d66-48c7-bd82-1537396f9f24" containerID="c090b18d133b6699345401646de4b070aa5a215e6bff5d1ed216b3b01f2ab534" exitCode=0 Dec 07 19:45:01 crc kubenswrapper[4815]: I1207 19:45:01.388063 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" event={"ID":"29f8422b-0d66-48c7-bd82-1537396f9f24","Type":"ContainerDied","Data":"c090b18d133b6699345401646de4b070aa5a215e6bff5d1ed216b3b01f2ab534"} Dec 07 19:45:01 crc kubenswrapper[4815]: I1207 19:45:01.388164 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" 
event={"ID":"29f8422b-0d66-48c7-bd82-1537396f9f24","Type":"ContainerStarted","Data":"32799f15406dabce87f74c1847709442c0aea4fae7cdb3200ece8a0209b455cc"} Dec 07 19:45:01 crc kubenswrapper[4815]: I1207 19:45:01.781754 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a333b1b6-9ef2-4332-9622-e9fcc9d41854" path="/var/lib/kubelet/pods/a333b1b6-9ef2-4332-9622-e9fcc9d41854/volumes" Dec 07 19:45:02 crc kubenswrapper[4815]: I1207 19:45:02.846434 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.019077 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm2cl\" (UniqueName: \"kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl\") pod \"29f8422b-0d66-48c7-bd82-1537396f9f24\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.019180 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume\") pod \"29f8422b-0d66-48c7-bd82-1537396f9f24\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.019211 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume\") pod \"29f8422b-0d66-48c7-bd82-1537396f9f24\" (UID: \"29f8422b-0d66-48c7-bd82-1537396f9f24\") " Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.020157 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume" (OuterVolumeSpecName: "config-volume") pod "29f8422b-0d66-48c7-bd82-1537396f9f24" (UID: 
"29f8422b-0d66-48c7-bd82-1537396f9f24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.026028 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl" (OuterVolumeSpecName: "kube-api-access-xm2cl") pod "29f8422b-0d66-48c7-bd82-1537396f9f24" (UID: "29f8422b-0d66-48c7-bd82-1537396f9f24"). InnerVolumeSpecName "kube-api-access-xm2cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.026113 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29f8422b-0d66-48c7-bd82-1537396f9f24" (UID: "29f8422b-0d66-48c7-bd82-1537396f9f24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.121742 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm2cl\" (UniqueName: \"kubernetes.io/projected/29f8422b-0d66-48c7-bd82-1537396f9f24-kube-api-access-xm2cl\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.121785 4815 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29f8422b-0d66-48c7-bd82-1537396f9f24-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.121797 4815 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29f8422b-0d66-48c7-bd82-1537396f9f24-config-volume\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.413121 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" event={"ID":"29f8422b-0d66-48c7-bd82-1537396f9f24","Type":"ContainerDied","Data":"32799f15406dabce87f74c1847709442c0aea4fae7cdb3200ece8a0209b455cc"} Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.413162 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32799f15406dabce87f74c1847709442c0aea4fae7cdb3200ece8a0209b455cc" Dec 07 19:45:03 crc kubenswrapper[4815]: I1207 19:45:03.413237 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418945-8trsf" Dec 07 19:45:06 crc kubenswrapper[4815]: I1207 19:45:06.040671 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jfdqh"] Dec 07 19:45:06 crc kubenswrapper[4815]: I1207 19:45:06.053219 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jfdqh"] Dec 07 19:45:07 crc kubenswrapper[4815]: I1207 19:45:07.784261 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31f2b72-9e36-4a29-90e2-a3599b27f94b" path="/var/lib/kubelet/pods/c31f2b72-9e36-4a29-90e2-a3599b27f94b/volumes" Dec 07 19:45:09 crc kubenswrapper[4815]: I1207 19:45:09.030251 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xxh6g"] Dec 07 19:45:09 crc kubenswrapper[4815]: I1207 19:45:09.052935 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xxh6g"] Dec 07 19:45:09 crc kubenswrapper[4815]: I1207 19:45:09.780269 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f0f688e-04dc-46a6-a4eb-1c8d5e635abe" path="/var/lib/kubelet/pods/2f0f688e-04dc-46a6-a4eb-1c8d5e635abe/volumes" Dec 07 19:45:14 crc kubenswrapper[4815]: I1207 19:45:14.769479 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:45:14 crc 
kubenswrapper[4815]: E1207 19:45:14.769972 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:45:21 crc kubenswrapper[4815]: I1207 19:45:21.062570 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5s8vb"] Dec 07 19:45:21 crc kubenswrapper[4815]: I1207 19:45:21.074153 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5s8vb"] Dec 07 19:45:21 crc kubenswrapper[4815]: I1207 19:45:21.780665 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9829ff6-a37c-403a-9b72-8a2c1e3df5d5" path="/var/lib/kubelet/pods/a9829ff6-a37c-403a-9b72-8a2c1e3df5d5/volumes" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.536136 4815 scope.go:117] "RemoveContainer" containerID="37d0eee11812d81cb6af3ee722d559ea0115882ed3d4303ab77b47e111d50e87" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.577476 4815 scope.go:117] "RemoveContainer" containerID="9733e73186249f0acf8025983943ef1294053b7f3b72e4d643632ffa9e1ab6f6" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.607683 4815 scope.go:117] "RemoveContainer" containerID="e6e3a0aa7582e8a28024c2949c8649e6dfe241f89d48688460aada97a926a10b" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.652106 4815 scope.go:117] "RemoveContainer" containerID="7700c80d760dfd0819f2f9de3d138f2dafe643324fdbc5b091bdd0eda3b36356" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.699363 4815 scope.go:117] "RemoveContainer" containerID="6eb415b1b752efa9ed7b88727e7e457320edcf19312d3224929a6a5dbd27ff24" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.755096 
4815 scope.go:117] "RemoveContainer" containerID="eac252735903913f8661462e07b24ff978ccdcc2edc9ab499bb5168fbaab94b7" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.800081 4815 scope.go:117] "RemoveContainer" containerID="cd537bca59835e29c964d50ab4b23521eee29ad171fc43572eeb1460e1e2bec4" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.843728 4815 scope.go:117] "RemoveContainer" containerID="e5171d2eca80ad011fa8ee06ba4baec734cf54b36eb0a0c632b1146f3e2e3fd1" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.868925 4815 scope.go:117] "RemoveContainer" containerID="889d8bcfb06ac09b2961d353f1f74b92505bb6027988185fec9ae18edfec1164" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.888325 4815 scope.go:117] "RemoveContainer" containerID="9101cea0698219e2c998588a6935d8d020c7e39bdcf2b48591620eabbab7b833" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.908668 4815 scope.go:117] "RemoveContainer" containerID="4e80c0b15820ac03d4cbf050bb5a08544ee945014233e8026425a70df3759782" Dec 07 19:45:22 crc kubenswrapper[4815]: I1207 19:45:22.930038 4815 scope.go:117] "RemoveContainer" containerID="74bd87f41d19359e2f6de109e98f309e50a3be966bdc0417071bfb0fb9d3c5c8" Dec 07 19:45:25 crc kubenswrapper[4815]: I1207 19:45:25.037053 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ljqjq"] Dec 07 19:45:25 crc kubenswrapper[4815]: I1207 19:45:25.048293 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ljqjq"] Dec 07 19:45:25 crc kubenswrapper[4815]: I1207 19:45:25.656367 4815 generic.go:334] "Generic (PLEG): container finished" podID="acfde0af-ed46-4636-b117-d2c2c7b2c0c8" containerID="2f6a0ac1356f1f6063ac6dccff231c0e3f65896581cba4238cc1be1515f31358" exitCode=0 Dec 07 19:45:25 crc kubenswrapper[4815]: I1207 19:45:25.656421 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" 
event={"ID":"acfde0af-ed46-4636-b117-d2c2c7b2c0c8","Type":"ContainerDied","Data":"2f6a0ac1356f1f6063ac6dccff231c0e3f65896581cba4238cc1be1515f31358"} Dec 07 19:45:25 crc kubenswrapper[4815]: I1207 19:45:25.782311 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfb75e0b-4e7b-484f-832b-5ed69650f1f1" path="/var/lib/kubelet/pods/cfb75e0b-4e7b-484f-832b-5ed69650f1f1/volumes" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.047861 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.194014 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory\") pod \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.194404 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key\") pod \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.194679 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hgtc\" (UniqueName: \"kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc\") pod \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\" (UID: \"acfde0af-ed46-4636-b117-d2c2c7b2c0c8\") " Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.205588 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc" (OuterVolumeSpecName: "kube-api-access-5hgtc") pod "acfde0af-ed46-4636-b117-d2c2c7b2c0c8" (UID: 
"acfde0af-ed46-4636-b117-d2c2c7b2c0c8"). InnerVolumeSpecName "kube-api-access-5hgtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.217982 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "acfde0af-ed46-4636-b117-d2c2c7b2c0c8" (UID: "acfde0af-ed46-4636-b117-d2c2c7b2c0c8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.219075 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory" (OuterVolumeSpecName: "inventory") pod "acfde0af-ed46-4636-b117-d2c2c7b2c0c8" (UID: "acfde0af-ed46-4636-b117-d2c2c7b2c0c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.296604 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hgtc\" (UniqueName: \"kubernetes.io/projected/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-kube-api-access-5hgtc\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.296638 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.296648 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/acfde0af-ed46-4636-b117-d2c2c7b2c0c8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.677477 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" 
event={"ID":"acfde0af-ed46-4636-b117-d2c2c7b2c0c8","Type":"ContainerDied","Data":"c0a319f3eac3eef6a3aef53827517d02fac39cc7af430535cafaec2195ea41a0"} Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.677522 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0a319f3eac3eef6a3aef53827517d02fac39cc7af430535cafaec2195ea41a0" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.677527 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-t8v85" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.793540 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9"] Dec 07 19:45:27 crc kubenswrapper[4815]: E1207 19:45:27.793965 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfde0af-ed46-4636-b117-d2c2c7b2c0c8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.793987 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfde0af-ed46-4636-b117-d2c2c7b2c0c8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:27 crc kubenswrapper[4815]: E1207 19:45:27.794029 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f8422b-0d66-48c7-bd82-1537396f9f24" containerName="collect-profiles" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.794037 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f8422b-0d66-48c7-bd82-1537396f9f24" containerName="collect-profiles" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.794236 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfde0af-ed46-4636-b117-d2c2c7b2c0c8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.794252 4815 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="29f8422b-0d66-48c7-bd82-1537396f9f24" containerName="collect-profiles" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.794983 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.799076 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.799276 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.799419 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.807391 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.828290 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9"] Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.907029 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.907177 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkjls\" (UniqueName: \"kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" 
(UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:27 crc kubenswrapper[4815]: I1207 19:45:27.907226 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.008867 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.009009 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkjls\" (UniqueName: \"kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.009061 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.014114 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.014401 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.039550 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkjls\" (UniqueName: \"kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.116641 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.628810 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9"] Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.692352 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" event={"ID":"af58b4e7-d807-47a1-9c1c-21ab66fe7605","Type":"ContainerStarted","Data":"8e61f2cec255284aff9977d6914abaf6bb178b60c37ac2dce21d5d58e5399f0d"} Dec 07 19:45:28 crc kubenswrapper[4815]: I1207 19:45:28.770521 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:45:28 crc kubenswrapper[4815]: E1207 19:45:28.771047 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:45:29 crc kubenswrapper[4815]: I1207 19:45:29.700286 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" event={"ID":"af58b4e7-d807-47a1-9c1c-21ab66fe7605","Type":"ContainerStarted","Data":"5dfc7eeb96d6c4b6c770fbe343e4e28d6912bec9fe316a43fe44324697f5221b"} Dec 07 19:45:29 crc kubenswrapper[4815]: I1207 19:45:29.719510 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" podStartSLOduration=2.113258213 podStartE2EDuration="2.719491491s" podCreationTimestamp="2025-12-07 19:45:27 +0000 UTC" firstStartedPulling="2025-12-07 
19:45:28.644193908 +0000 UTC m=+1833.223183963" lastFinishedPulling="2025-12-07 19:45:29.250427196 +0000 UTC m=+1833.829417241" observedRunningTime="2025-12-07 19:45:29.71699233 +0000 UTC m=+1834.295982385" watchObservedRunningTime="2025-12-07 19:45:29.719491491 +0000 UTC m=+1834.298481536" Dec 07 19:45:33 crc kubenswrapper[4815]: I1207 19:45:33.758966 4815 generic.go:334] "Generic (PLEG): container finished" podID="af58b4e7-d807-47a1-9c1c-21ab66fe7605" containerID="5dfc7eeb96d6c4b6c770fbe343e4e28d6912bec9fe316a43fe44324697f5221b" exitCode=0 Dec 07 19:45:33 crc kubenswrapper[4815]: I1207 19:45:33.759042 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" event={"ID":"af58b4e7-d807-47a1-9c1c-21ab66fe7605","Type":"ContainerDied","Data":"5dfc7eeb96d6c4b6c770fbe343e4e28d6912bec9fe316a43fe44324697f5221b"} Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.175344 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.347552 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory\") pod \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.347746 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key\") pod \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.347839 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkjls\" (UniqueName: \"kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls\") pod \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\" (UID: \"af58b4e7-d807-47a1-9c1c-21ab66fe7605\") " Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.361146 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls" (OuterVolumeSpecName: "kube-api-access-zkjls") pod "af58b4e7-d807-47a1-9c1c-21ab66fe7605" (UID: "af58b4e7-d807-47a1-9c1c-21ab66fe7605"). InnerVolumeSpecName "kube-api-access-zkjls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.393590 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af58b4e7-d807-47a1-9c1c-21ab66fe7605" (UID: "af58b4e7-d807-47a1-9c1c-21ab66fe7605"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.415506 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory" (OuterVolumeSpecName: "inventory") pod "af58b4e7-d807-47a1-9c1c-21ab66fe7605" (UID: "af58b4e7-d807-47a1-9c1c-21ab66fe7605"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.449598 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.449635 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkjls\" (UniqueName: \"kubernetes.io/projected/af58b4e7-d807-47a1-9c1c-21ab66fe7605-kube-api-access-zkjls\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.449649 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af58b4e7-d807-47a1-9c1c-21ab66fe7605-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.780587 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.790104 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9" event={"ID":"af58b4e7-d807-47a1-9c1c-21ab66fe7605","Type":"ContainerDied","Data":"8e61f2cec255284aff9977d6914abaf6bb178b60c37ac2dce21d5d58e5399f0d"} Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.790172 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e61f2cec255284aff9977d6914abaf6bb178b60c37ac2dce21d5d58e5399f0d" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.847018 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g"] Dec 07 19:45:35 crc kubenswrapper[4815]: E1207 19:45:35.847424 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af58b4e7-d807-47a1-9c1c-21ab66fe7605" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.847449 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="af58b4e7-d807-47a1-9c1c-21ab66fe7605" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.847638 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="af58b4e7-d807-47a1-9c1c-21ab66fe7605" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.851274 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.854887 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.855151 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.855219 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.855350 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.857069 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g"] Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.961000 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mndl\" (UniqueName: \"kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.961052 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:35 crc kubenswrapper[4815]: I1207 19:45:35.961071 4815 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.063141 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mndl\" (UniqueName: \"kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.063198 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.063220 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.069908 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: 
\"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.070944 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.095772 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mndl\" (UniqueName: \"kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.171389 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.753598 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g"] Dec 07 19:45:36 crc kubenswrapper[4815]: I1207 19:45:36.791135 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" event={"ID":"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393","Type":"ContainerStarted","Data":"f13675ab5519c3712d5fe8d0d0108acb3c4580b0ab9df979c07b3e4b7eb6e8a5"} Dec 07 19:45:37 crc kubenswrapper[4815]: I1207 19:45:37.813889 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" event={"ID":"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393","Type":"ContainerStarted","Data":"9d9216c1770c43c050cebb73ce0f2206205fd66edab20d9588395e9fb76f0b4a"} Dec 07 19:45:37 crc kubenswrapper[4815]: I1207 19:45:37.842639 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" podStartSLOduration=2.197819031 podStartE2EDuration="2.84260391s" podCreationTimestamp="2025-12-07 19:45:35 +0000 UTC" firstStartedPulling="2025-12-07 19:45:36.732324767 +0000 UTC m=+1841.311314822" lastFinishedPulling="2025-12-07 19:45:37.377109616 +0000 UTC m=+1841.956099701" observedRunningTime="2025-12-07 19:45:37.831324451 +0000 UTC m=+1842.410314496" watchObservedRunningTime="2025-12-07 19:45:37.84260391 +0000 UTC m=+1842.421593955" Dec 07 19:45:40 crc kubenswrapper[4815]: I1207 19:45:40.769397 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:45:40 crc kubenswrapper[4815]: E1207 19:45:40.770159 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:45:55 crc kubenswrapper[4815]: I1207 19:45:55.777679 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:45:55 crc kubenswrapper[4815]: E1207 19:45:55.778763 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:46:05 crc kubenswrapper[4815]: I1207 19:46:05.053320 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mxr8g"] Dec 07 19:46:05 crc kubenswrapper[4815]: I1207 19:46:05.064426 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mxr8g"] Dec 07 19:46:05 crc kubenswrapper[4815]: I1207 19:46:05.782366 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87680dbf-76ea-41e7-9349-4a83b07b7c8f" path="/var/lib/kubelet/pods/87680dbf-76ea-41e7-9349-4a83b07b7c8f/volumes" Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.051683 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9jkcq"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.067394 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-5cd5-account-create-update-tdn22"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.077713 4815 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-cell1-db-create-gmwt8"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.088485 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-5cd5-account-create-update-tdn22"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.097753 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d187-account-create-update-db6lq"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.105663 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9jkcq"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.112645 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gmwt8"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.118689 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d187-account-create-update-db6lq"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.126183 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ba1d-account-create-update-kt8v7"] Dec 07 19:46:06 crc kubenswrapper[4815]: I1207 19:46:06.134510 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ba1d-account-create-update-kt8v7"] Dec 07 19:46:07 crc kubenswrapper[4815]: I1207 19:46:07.794852 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54cb9bbd-aa49-4113-8094-db794e456915" path="/var/lib/kubelet/pods/54cb9bbd-aa49-4113-8094-db794e456915/volumes" Dec 07 19:46:07 crc kubenswrapper[4815]: I1207 19:46:07.797491 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4283dd6-964a-4f3d-8509-7a2c7fbd393f" path="/var/lib/kubelet/pods/c4283dd6-964a-4f3d-8509-7a2c7fbd393f/volumes" Dec 07 19:46:07 crc kubenswrapper[4815]: I1207 19:46:07.798879 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca9cdb9-858e-463a-b89f-4fe86331b304" 
path="/var/lib/kubelet/pods/cca9cdb9-858e-463a-b89f-4fe86331b304/volumes" Dec 07 19:46:07 crc kubenswrapper[4815]: I1207 19:46:07.800433 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d740fe41-3740-496c-b34d-3fd2f63fb619" path="/var/lib/kubelet/pods/d740fe41-3740-496c-b34d-3fd2f63fb619/volumes" Dec 07 19:46:07 crc kubenswrapper[4815]: I1207 19:46:07.803190 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e656eb7e-25aa-4982-a61a-d5a41cfef90d" path="/var/lib/kubelet/pods/e656eb7e-25aa-4982-a61a-d5a41cfef90d/volumes" Dec 07 19:46:08 crc kubenswrapper[4815]: I1207 19:46:08.770975 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:46:08 crc kubenswrapper[4815]: E1207 19:46:08.771395 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.140705 4815 scope.go:117] "RemoveContainer" containerID="63c108d380c4d742b2fa6c28ab59f9759c819309bd5dc1bcb83bb82eaf53d029" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.169162 4815 scope.go:117] "RemoveContainer" containerID="9807558ff6db72094e22cacfd87a8d67aa115435ff38a4b101e4eb6219c86929" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.258769 4815 scope.go:117] "RemoveContainer" containerID="055213ce89c021f00d002f19dbb91660ad69d0f48893edf9abb3298d5ef36bc6" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.287250 4815 scope.go:117] "RemoveContainer" containerID="7bcc6874aa3d429722f2cc094154f26214b1835ffe964a06f0c7f27672616f32" Dec 07 19:46:23 crc 
kubenswrapper[4815]: I1207 19:46:23.321327 4815 scope.go:117] "RemoveContainer" containerID="0cec2bb0329f76f7f7a557247163d68d41b2eed2fffcd968cf72da9ca9e73cba" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.359997 4815 scope.go:117] "RemoveContainer" containerID="31df9400bc214ed9b14b1b8916413c0977d9624efba7003e4d8c173732d571c0" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.404268 4815 scope.go:117] "RemoveContainer" containerID="2d596b7f5dc3f88d2fb84b5c96a9b7351ba0f39a7c0fe80afa378773bdfeef7f" Dec 07 19:46:23 crc kubenswrapper[4815]: I1207 19:46:23.770485 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:46:23 crc kubenswrapper[4815]: E1207 19:46:23.771690 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:46:33 crc kubenswrapper[4815]: I1207 19:46:33.052521 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgp5k"] Dec 07 19:46:33 crc kubenswrapper[4815]: I1207 19:46:33.060946 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qgp5k"] Dec 07 19:46:33 crc kubenswrapper[4815]: I1207 19:46:33.780405 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5" path="/var/lib/kubelet/pods/83eabbaf-4dd5-4f1e-b9f8-bc98f0b295c5/volumes" Dec 07 19:46:35 crc kubenswrapper[4815]: I1207 19:46:35.779975 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:46:36 crc kubenswrapper[4815]: 
I1207 19:46:36.332468 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34"} Dec 07 19:46:43 crc kubenswrapper[4815]: I1207 19:46:43.394165 4815 generic.go:334] "Generic (PLEG): container finished" podID="c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" containerID="9d9216c1770c43c050cebb73ce0f2206205fd66edab20d9588395e9fb76f0b4a" exitCode=0 Dec 07 19:46:43 crc kubenswrapper[4815]: I1207 19:46:43.394293 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" event={"ID":"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393","Type":"ContainerDied","Data":"9d9216c1770c43c050cebb73ce0f2206205fd66edab20d9588395e9fb76f0b4a"} Dec 07 19:46:44 crc kubenswrapper[4815]: I1207 19:46:44.826983 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:46:44 crc kubenswrapper[4815]: I1207 19:46:44.991691 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mndl\" (UniqueName: \"kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl\") pod \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " Dec 07 19:46:44 crc kubenswrapper[4815]: I1207 19:46:44.992018 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory\") pod \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " Dec 07 19:46:44 crc kubenswrapper[4815]: I1207 19:46:44.992060 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key\") pod \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\" (UID: \"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393\") " Dec 07 19:46:44 crc kubenswrapper[4815]: I1207 19:46:44.999027 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl" (OuterVolumeSpecName: "kube-api-access-8mndl") pod "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" (UID: "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393"). InnerVolumeSpecName "kube-api-access-8mndl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.022838 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory" (OuterVolumeSpecName: "inventory") pod "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" (UID: "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.032462 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" (UID: "c6416bd8-ba0c-491b-b5b3-3a2ed9dff393"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.094208 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.094457 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.094544 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mndl\" (UniqueName: \"kubernetes.io/projected/c6416bd8-ba0c-491b-b5b3-3a2ed9dff393-kube-api-access-8mndl\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.415244 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" event={"ID":"c6416bd8-ba0c-491b-b5b3-3a2ed9dff393","Type":"ContainerDied","Data":"f13675ab5519c3712d5fe8d0d0108acb3c4580b0ab9df979c07b3e4b7eb6e8a5"} Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.415292 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f13675ab5519c3712d5fe8d0d0108acb3c4580b0ab9df979c07b3e4b7eb6e8a5" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.415354 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.515277 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mcc2r"] Dec 07 19:46:45 crc kubenswrapper[4815]: E1207 19:46:45.516510 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.516530 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.517001 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6416bd8-ba0c-491b-b5b3-3a2ed9dff393" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.518211 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.522327 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.522694 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.523430 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.523624 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.535183 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mcc2r"] Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.604979 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6h9w\" (UniqueName: \"kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.605025 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.605084 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.706215 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6h9w\" (UniqueName: \"kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.706262 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.706325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.712250 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.723541 4815 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.726595 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6h9w\" (UniqueName: \"kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w\") pod \"ssh-known-hosts-edpm-deployment-mcc2r\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:45 crc kubenswrapper[4815]: I1207 19:46:45.853486 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:46 crc kubenswrapper[4815]: I1207 19:46:46.461717 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-mcc2r"] Dec 07 19:46:47 crc kubenswrapper[4815]: I1207 19:46:47.432811 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" event={"ID":"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22","Type":"ContainerStarted","Data":"a4c37613337357a9d55a5879d362d60590488d87b438f9a40b418b8826503ade"} Dec 07 19:46:47 crc kubenswrapper[4815]: I1207 19:46:47.434677 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" event={"ID":"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22","Type":"ContainerStarted","Data":"2db3f13b4342d4ea4a183bb35d4e5ef8b83564c90a8a7be5486adb883354febe"} Dec 07 19:46:47 crc kubenswrapper[4815]: I1207 19:46:47.452941 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" 
podStartSLOduration=1.9634157920000002 podStartE2EDuration="2.452903386s" podCreationTimestamp="2025-12-07 19:46:45 +0000 UTC" firstStartedPulling="2025-12-07 19:46:46.452124403 +0000 UTC m=+1911.031114448" lastFinishedPulling="2025-12-07 19:46:46.941611977 +0000 UTC m=+1911.520602042" observedRunningTime="2025-12-07 19:46:47.448336327 +0000 UTC m=+1912.027326412" watchObservedRunningTime="2025-12-07 19:46:47.452903386 +0000 UTC m=+1912.031893431" Dec 07 19:46:55 crc kubenswrapper[4815]: I1207 19:46:55.512480 4815 generic.go:334] "Generic (PLEG): container finished" podID="f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" containerID="a4c37613337357a9d55a5879d362d60590488d87b438f9a40b418b8826503ade" exitCode=0 Dec 07 19:46:55 crc kubenswrapper[4815]: I1207 19:46:55.512596 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" event={"ID":"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22","Type":"ContainerDied","Data":"a4c37613337357a9d55a5879d362d60590488d87b438f9a40b418b8826503ade"} Dec 07 19:46:56 crc kubenswrapper[4815]: I1207 19:46:56.877671 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.032885 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam\") pod \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.033036 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6h9w\" (UniqueName: \"kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w\") pod \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.033092 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0\") pod \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\" (UID: \"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22\") " Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.039340 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w" (OuterVolumeSpecName: "kube-api-access-z6h9w") pod "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" (UID: "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22"). InnerVolumeSpecName "kube-api-access-z6h9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.061727 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" (UID: "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.078077 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" (UID: "f094fb15-aab4-422c-b2a1-1d0fd8d7ff22"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.134934 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6h9w\" (UniqueName: \"kubernetes.io/projected/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-kube-api-access-z6h9w\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.134965 4815 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.134976 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f094fb15-aab4-422c-b2a1-1d0fd8d7ff22-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.528456 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" 
event={"ID":"f094fb15-aab4-422c-b2a1-1d0fd8d7ff22","Type":"ContainerDied","Data":"2db3f13b4342d4ea4a183bb35d4e5ef8b83564c90a8a7be5486adb883354febe"} Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.528493 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db3f13b4342d4ea4a183bb35d4e5ef8b83564c90a8a7be5486adb883354febe" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.528520 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-mcc2r" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.623398 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp"] Dec 07 19:46:57 crc kubenswrapper[4815]: E1207 19:46:57.623780 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" containerName="ssh-known-hosts-edpm-deployment" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.623796 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" containerName="ssh-known-hosts-edpm-deployment" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.623988 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f094fb15-aab4-422c-b2a1-1d0fd8d7ff22" containerName="ssh-known-hosts-edpm-deployment" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.624592 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.627188 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.628596 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.632430 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.636388 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp"] Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.642225 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.744182 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.744225 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.744277 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrwfq\" (UniqueName: \"kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.845560 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrwfq\" (UniqueName: \"kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.845689 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.845718 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.851200 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.856079 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.867629 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrwfq\" (UniqueName: \"kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-mp7vp\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:57 crc kubenswrapper[4815]: I1207 19:46:57.941719 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:46:58 crc kubenswrapper[4815]: I1207 19:46:58.516873 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp"] Dec 07 19:46:58 crc kubenswrapper[4815]: I1207 19:46:58.539015 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" event={"ID":"bfe86916-a7d7-41a5-853c-acb6c37a85c1","Type":"ContainerStarted","Data":"259cd6f0c6e20849b1e67eb8c784c0b03a13986c0b688bf9f04b2b285a089f0d"} Dec 07 19:46:59 crc kubenswrapper[4815]: I1207 19:46:59.557016 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" event={"ID":"bfe86916-a7d7-41a5-853c-acb6c37a85c1","Type":"ContainerStarted","Data":"e53b7333353747c26eeb7502cfd11d15f07ce2786e8e7568469580dbd54d1e7c"} Dec 07 19:46:59 crc kubenswrapper[4815]: I1207 19:46:59.577525 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" podStartSLOduration=2.15157017 podStartE2EDuration="2.577505005s" podCreationTimestamp="2025-12-07 19:46:57 +0000 UTC" firstStartedPulling="2025-12-07 19:46:58.519114341 +0000 UTC m=+1923.098104386" lastFinishedPulling="2025-12-07 19:46:58.945049166 +0000 UTC m=+1923.524039221" observedRunningTime="2025-12-07 19:46:59.574907832 +0000 UTC m=+1924.153897877" watchObservedRunningTime="2025-12-07 19:46:59.577505005 +0000 UTC m=+1924.156495060" Dec 07 19:47:00 crc kubenswrapper[4815]: I1207 19:47:00.075979 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tsf6c"] Dec 07 19:47:00 crc kubenswrapper[4815]: I1207 19:47:00.090685 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tsf6c"] Dec 07 19:47:01 crc kubenswrapper[4815]: I1207 19:47:01.782090 4815 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdbde09-9f1a-4448-8fa3-372d29371084" path="/var/lib/kubelet/pods/2cdbde09-9f1a-4448-8fa3-372d29371084/volumes" Dec 07 19:47:02 crc kubenswrapper[4815]: I1207 19:47:02.046842 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nrp9x"] Dec 07 19:47:02 crc kubenswrapper[4815]: I1207 19:47:02.057420 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nrp9x"] Dec 07 19:47:03 crc kubenswrapper[4815]: I1207 19:47:03.779621 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e95a85c7-1619-4d79-bc43-3acc00d3ab9a" path="/var/lib/kubelet/pods/e95a85c7-1619-4d79-bc43-3acc00d3ab9a/volumes" Dec 07 19:47:08 crc kubenswrapper[4815]: I1207 19:47:08.628644 4815 generic.go:334] "Generic (PLEG): container finished" podID="bfe86916-a7d7-41a5-853c-acb6c37a85c1" containerID="e53b7333353747c26eeb7502cfd11d15f07ce2786e8e7568469580dbd54d1e7c" exitCode=0 Dec 07 19:47:08 crc kubenswrapper[4815]: I1207 19:47:08.628716 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" event={"ID":"bfe86916-a7d7-41a5-853c-acb6c37a85c1","Type":"ContainerDied","Data":"e53b7333353747c26eeb7502cfd11d15f07ce2786e8e7568469580dbd54d1e7c"} Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.039309 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.195589 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory\") pod \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.195732 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrwfq\" (UniqueName: \"kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq\") pod \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.195800 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key\") pod \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\" (UID: \"bfe86916-a7d7-41a5-853c-acb6c37a85c1\") " Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.202176 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq" (OuterVolumeSpecName: "kube-api-access-hrwfq") pod "bfe86916-a7d7-41a5-853c-acb6c37a85c1" (UID: "bfe86916-a7d7-41a5-853c-acb6c37a85c1"). InnerVolumeSpecName "kube-api-access-hrwfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.221159 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory" (OuterVolumeSpecName: "inventory") pod "bfe86916-a7d7-41a5-853c-acb6c37a85c1" (UID: "bfe86916-a7d7-41a5-853c-acb6c37a85c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.224318 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bfe86916-a7d7-41a5-853c-acb6c37a85c1" (UID: "bfe86916-a7d7-41a5-853c-acb6c37a85c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.298806 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.298847 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrwfq\" (UniqueName: \"kubernetes.io/projected/bfe86916-a7d7-41a5-853c-acb6c37a85c1-kube-api-access-hrwfq\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.298860 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfe86916-a7d7-41a5-853c-acb6c37a85c1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.647153 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" event={"ID":"bfe86916-a7d7-41a5-853c-acb6c37a85c1","Type":"ContainerDied","Data":"259cd6f0c6e20849b1e67eb8c784c0b03a13986c0b688bf9f04b2b285a089f0d"} Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.647198 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="259cd6f0c6e20849b1e67eb8c784c0b03a13986c0b688bf9f04b2b285a089f0d" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.647218 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-mp7vp" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.720809 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz"] Dec 07 19:47:10 crc kubenswrapper[4815]: E1207 19:47:10.721332 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe86916-a7d7-41a5-853c-acb6c37a85c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.721353 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe86916-a7d7-41a5-853c-acb6c37a85c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.721570 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe86916-a7d7-41a5-853c-acb6c37a85c1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.722314 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.731397 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.731624 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rvv2t" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.734534 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz"] Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.731502 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.735388 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.914366 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n27n\" (UniqueName: \"kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.914864 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:10 crc kubenswrapper[4815]: I1207 19:47:10.915054 4815 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.016427 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.016667 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n27n\" (UniqueName: \"kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.016812 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.020149 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.021618 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.043565 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n27n\" (UniqueName: \"kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.053665 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.575974 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz"] Dec 07 19:47:11 crc kubenswrapper[4815]: I1207 19:47:11.655680 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" event={"ID":"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e","Type":"ContainerStarted","Data":"577e31742182e41a996b85cc4f039f35a082ea4e00941c77b80eef34c32327d7"} Dec 07 19:47:12 crc kubenswrapper[4815]: I1207 19:47:12.665376 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" event={"ID":"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e","Type":"ContainerStarted","Data":"42dbe7945565c56af2a2d95e980cb80384149625fbc9cdbe09d84e95e0189dde"} Dec 07 19:47:12 crc kubenswrapper[4815]: I1207 19:47:12.689542 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" podStartSLOduration=2.271974412 podStartE2EDuration="2.689510299s" podCreationTimestamp="2025-12-07 19:47:10 +0000 UTC" firstStartedPulling="2025-12-07 19:47:11.586291627 +0000 UTC m=+1936.165281672" lastFinishedPulling="2025-12-07 19:47:12.003827514 +0000 UTC m=+1936.582817559" observedRunningTime="2025-12-07 19:47:12.681087641 +0000 UTC m=+1937.260077686" watchObservedRunningTime="2025-12-07 19:47:12.689510299 +0000 UTC m=+1937.268500384" Dec 07 19:47:22 crc kubenswrapper[4815]: I1207 19:47:22.792012 4815 generic.go:334] "Generic (PLEG): container finished" podID="f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" containerID="42dbe7945565c56af2a2d95e980cb80384149625fbc9cdbe09d84e95e0189dde" exitCode=0 Dec 07 19:47:22 crc kubenswrapper[4815]: I1207 19:47:22.792079 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" event={"ID":"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e","Type":"ContainerDied","Data":"42dbe7945565c56af2a2d95e980cb80384149625fbc9cdbe09d84e95e0189dde"} Dec 07 19:47:23 crc kubenswrapper[4815]: I1207 19:47:23.525270 4815 scope.go:117] "RemoveContainer" containerID="6467d92ce190ab795185aadce7f6f83f049a27f8b66597f85497c938c9eacee3" Dec 07 19:47:23 crc kubenswrapper[4815]: I1207 19:47:23.578596 4815 scope.go:117] "RemoveContainer" containerID="1c4ec4243730fce8a5a4eb907ccbef3c5cd68c4dff9ca432250d80287da1aad6" Dec 07 19:47:23 crc kubenswrapper[4815]: I1207 19:47:23.619343 4815 scope.go:117] "RemoveContainer" containerID="ecde84fb50b3ec0ddf493efa89a6e6252d087d8723e4f8cf94993d146ec52765" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.194580 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.206273 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key\") pod \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.206394 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n27n\" (UniqueName: \"kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n\") pod \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\" (UID: \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.206424 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory\") pod \"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\" (UID: 
\"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e\") " Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.212402 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n" (OuterVolumeSpecName: "kube-api-access-6n27n") pod "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" (UID: "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e"). InnerVolumeSpecName "kube-api-access-6n27n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.240341 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" (UID: "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.245200 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory" (OuterVolumeSpecName: "inventory") pod "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" (UID: "f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.308718 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n27n\" (UniqueName: \"kubernetes.io/projected/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-kube-api-access-6n27n\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.308746 4815 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-inventory\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.308754 4815 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.811026 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" event={"ID":"f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e","Type":"ContainerDied","Data":"577e31742182e41a996b85cc4f039f35a082ea4e00941c77b80eef34c32327d7"} Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.811358 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="577e31742182e41a996b85cc4f039f35a082ea4e00941c77b80eef34c32327d7" Dec 07 19:47:24 crc kubenswrapper[4815]: I1207 19:47:24.811048 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz" Dec 07 19:47:44 crc kubenswrapper[4815]: I1207 19:47:44.050171 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7kvht"] Dec 07 19:47:44 crc kubenswrapper[4815]: I1207 19:47:44.058311 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7kvht"] Dec 07 19:47:45 crc kubenswrapper[4815]: I1207 19:47:45.798335 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97bed517-8a3a-42ee-8a74-e9ad9416898a" path="/var/lib/kubelet/pods/97bed517-8a3a-42ee-8a74-e9ad9416898a/volumes" Dec 07 19:48:23 crc kubenswrapper[4815]: I1207 19:48:23.754147 4815 scope.go:117] "RemoveContainer" containerID="484bea2a3a785e8e4efcc939223774e3732c6c7235cd3dd30885896a89b6a7dd" Dec 07 19:48:42 crc kubenswrapper[4815]: I1207 19:48:42.872109 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:48:42 crc kubenswrapper[4815]: E1207 19:48:42.873495 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:48:42 crc kubenswrapper[4815]: I1207 19:48:42.873533 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:48:42 crc kubenswrapper[4815]: I1207 19:48:42.876108 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 07 19:48:42 crc kubenswrapper[4815]: I1207 19:48:42.878399 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:42 crc kubenswrapper[4815]: I1207 19:48:42.903616 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.009014 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtz65\" (UniqueName: \"kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.009111 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.009184 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.111147 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtz65\" (UniqueName: \"kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.111486 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.111634 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.112032 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.112039 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.141726 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtz65\" (UniqueName: \"kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65\") pod \"redhat-operators-nl9zm\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.211357 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:43 crc kubenswrapper[4815]: I1207 19:48:43.690794 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:48:44 crc kubenswrapper[4815]: I1207 19:48:44.672844 4815 generic.go:334] "Generic (PLEG): container finished" podID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerID="cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6" exitCode=0 Dec 07 19:48:44 crc kubenswrapper[4815]: I1207 19:48:44.673208 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerDied","Data":"cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6"} Dec 07 19:48:44 crc kubenswrapper[4815]: I1207 19:48:44.673709 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerStarted","Data":"12bba4067210eea8c7721e74b28354354d7a4cdd0dae61ccdff1e6b9663ab192"} Dec 07 19:48:44 crc kubenswrapper[4815]: I1207 19:48:44.677571 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:48:45 crc kubenswrapper[4815]: I1207 19:48:45.684610 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerStarted","Data":"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415"} Dec 07 19:48:48 crc kubenswrapper[4815]: I1207 19:48:48.712243 4815 generic.go:334] "Generic (PLEG): container finished" podID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerID="53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415" exitCode=0 Dec 07 19:48:48 crc kubenswrapper[4815]: I1207 19:48:48.712347 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerDied","Data":"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415"} Dec 07 19:48:49 crc kubenswrapper[4815]: I1207 19:48:49.767277 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerStarted","Data":"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa"} Dec 07 19:48:49 crc kubenswrapper[4815]: I1207 19:48:49.816782 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nl9zm" podStartSLOduration=3.186006835 podStartE2EDuration="7.816758554s" podCreationTimestamp="2025-12-07 19:48:42 +0000 UTC" firstStartedPulling="2025-12-07 19:48:44.67700971 +0000 UTC m=+2029.255999795" lastFinishedPulling="2025-12-07 19:48:49.307761469 +0000 UTC m=+2033.886751514" observedRunningTime="2025-12-07 19:48:49.800119143 +0000 UTC m=+2034.379109188" watchObservedRunningTime="2025-12-07 19:48:49.816758554 +0000 UTC m=+2034.395748619" Dec 07 19:48:53 crc kubenswrapper[4815]: I1207 19:48:53.211497 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:53 crc kubenswrapper[4815]: I1207 19:48:53.213354 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:48:54 crc kubenswrapper[4815]: I1207 19:48:54.263871 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nl9zm" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="registry-server" probeResult="failure" output=< Dec 07 19:48:54 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:48:54 crc kubenswrapper[4815]: > Dec 07 19:48:56 crc kubenswrapper[4815]: I1207 
19:48:56.360341 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:48:56 crc kubenswrapper[4815]: I1207 19:48:56.360707 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:49:03 crc kubenswrapper[4815]: I1207 19:49:03.289963 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:49:03 crc kubenswrapper[4815]: I1207 19:49:03.373336 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:49:03 crc kubenswrapper[4815]: I1207 19:49:03.555697 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:49:04 crc kubenswrapper[4815]: I1207 19:49:04.911840 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nl9zm" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="registry-server" containerID="cri-o://46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa" gracePeriod=2 Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.355180 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.554996 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content\") pod \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.555052 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities\") pod \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.555123 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtz65\" (UniqueName: \"kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65\") pod \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\" (UID: \"25fc81d8-f47c-4dd5-8244-4d51e5d40b54\") " Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.555812 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities" (OuterVolumeSpecName: "utilities") pod "25fc81d8-f47c-4dd5-8244-4d51e5d40b54" (UID: "25fc81d8-f47c-4dd5-8244-4d51e5d40b54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.560386 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65" (OuterVolumeSpecName: "kube-api-access-xtz65") pod "25fc81d8-f47c-4dd5-8244-4d51e5d40b54" (UID: "25fc81d8-f47c-4dd5-8244-4d51e5d40b54"). InnerVolumeSpecName "kube-api-access-xtz65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.657034 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.657073 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtz65\" (UniqueName: \"kubernetes.io/projected/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-kube-api-access-xtz65\") on node \"crc\" DevicePath \"\"" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.679968 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25fc81d8-f47c-4dd5-8244-4d51e5d40b54" (UID: "25fc81d8-f47c-4dd5-8244-4d51e5d40b54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.758857 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25fc81d8-f47c-4dd5-8244-4d51e5d40b54-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.926960 4815 generic.go:334] "Generic (PLEG): container finished" podID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerID="46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa" exitCode=0 Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.927179 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerDied","Data":"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa"} Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.928299 4815 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-nl9zm" event={"ID":"25fc81d8-f47c-4dd5-8244-4d51e5d40b54","Type":"ContainerDied","Data":"12bba4067210eea8c7721e74b28354354d7a4cdd0dae61ccdff1e6b9663ab192"} Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.928460 4815 scope.go:117] "RemoveContainer" containerID="46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.927250 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nl9zm" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.966615 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.970096 4815 scope.go:117] "RemoveContainer" containerID="53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415" Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.977967 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nl9zm"] Dec 07 19:49:05 crc kubenswrapper[4815]: I1207 19:49:05.995901 4815 scope.go:117] "RemoveContainer" containerID="cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.059853 4815 scope.go:117] "RemoveContainer" containerID="46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa" Dec 07 19:49:06 crc kubenswrapper[4815]: E1207 19:49:06.063794 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa\": container with ID starting with 46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa not found: ID does not exist" containerID="46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.063853 4815 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa"} err="failed to get container status \"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa\": rpc error: code = NotFound desc = could not find container \"46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa\": container with ID starting with 46834c47380885b02a4b2e65755d9ddb75d140f3bbb91c0d260a68196c7e14fa not found: ID does not exist" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.063894 4815 scope.go:117] "RemoveContainer" containerID="53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415" Dec 07 19:49:06 crc kubenswrapper[4815]: E1207 19:49:06.064412 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415\": container with ID starting with 53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415 not found: ID does not exist" containerID="53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.064459 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415"} err="failed to get container status \"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415\": rpc error: code = NotFound desc = could not find container \"53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415\": container with ID starting with 53f3c84034f9a451c06d64a8153a64e3d042d4357d49148c8e2c6807f3817415 not found: ID does not exist" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.064485 4815 scope.go:117] "RemoveContainer" containerID="cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6" Dec 07 19:49:06 crc kubenswrapper[4815]: E1207 
19:49:06.064715 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6\": container with ID starting with cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6 not found: ID does not exist" containerID="cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6" Dec 07 19:49:06 crc kubenswrapper[4815]: I1207 19:49:06.064739 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6"} err="failed to get container status \"cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6\": rpc error: code = NotFound desc = could not find container \"cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6\": container with ID starting with cca8a6b522ec69c1cbd8953e0c81175e44ecb07adaba3ba74a0b3913715ad9b6 not found: ID does not exist" Dec 07 19:49:07 crc kubenswrapper[4815]: I1207 19:49:07.780815 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" path="/var/lib/kubelet/pods/25fc81d8-f47c-4dd5-8244-4d51e5d40b54/volumes" Dec 07 19:49:26 crc kubenswrapper[4815]: I1207 19:49:26.359329 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:49:26 crc kubenswrapper[4815]: I1207 19:49:26.362294 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 07 19:49:56 crc kubenswrapper[4815]: I1207 19:49:56.359592 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:49:56 crc kubenswrapper[4815]: I1207 19:49:56.360152 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:49:56 crc kubenswrapper[4815]: I1207 19:49:56.360206 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:49:56 crc kubenswrapper[4815]: I1207 19:49:56.361095 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:49:56 crc kubenswrapper[4815]: I1207 19:49:56.361164 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34" gracePeriod=600 Dec 07 19:49:57 crc kubenswrapper[4815]: I1207 19:49:57.397066 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" 
containerID="a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34" exitCode=0 Dec 07 19:49:57 crc kubenswrapper[4815]: I1207 19:49:57.397202 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34"} Dec 07 19:49:57 crc kubenswrapper[4815]: I1207 19:49:57.398776 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7"} Dec 07 19:49:57 crc kubenswrapper[4815]: I1207 19:49:57.398886 4815 scope.go:117] "RemoveContainer" containerID="f352000dbce74affccf74924ce7b6a53eb5e32c31268c3c83ff0c4e8881fae0d" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.602096 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:51:52 crc kubenswrapper[4815]: E1207 19:51:52.604739 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="registry-server" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.604763 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="registry-server" Dec 07 19:51:52 crc kubenswrapper[4815]: E1207 19:51:52.604806 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="extract-utilities" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.604819 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="extract-utilities" Dec 07 19:51:52 crc kubenswrapper[4815]: E1207 19:51:52.604838 4815 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="extract-content" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.604846 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="extract-content" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.605246 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fc81d8-f47c-4dd5-8244-4d51e5d40b54" containerName="registry-server" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.607015 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.638139 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.759245 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.759349 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5c87\" (UniqueName: \"kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.759399 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities\") pod 
\"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.860953 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.861325 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5c87\" (UniqueName: \"kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.861460 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.861446 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.861797 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities\") pod \"redhat-marketplace-vtfh5\" (UID: 
\"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.887785 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5c87\" (UniqueName: \"kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87\") pod \"redhat-marketplace-vtfh5\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:52 crc kubenswrapper[4815]: I1207 19:51:52.941150 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:51:53 crc kubenswrapper[4815]: I1207 19:51:53.563935 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:51:53 crc kubenswrapper[4815]: W1207 19:51:53.573605 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f943a60_acea_4d9c_84f4_d2a33cbc0f75.slice/crio-6daa27923142cef3fe2161088b5b0f5324581950673be28f49de5b687ba67226 WatchSource:0}: Error finding container 6daa27923142cef3fe2161088b5b0f5324581950673be28f49de5b687ba67226: Status 404 returned error can't find the container with id 6daa27923142cef3fe2161088b5b0f5324581950673be28f49de5b687ba67226 Dec 07 19:51:54 crc kubenswrapper[4815]: I1207 19:51:54.533954 4815 generic.go:334] "Generic (PLEG): container finished" podID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerID="724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577" exitCode=0 Dec 07 19:51:54 crc kubenswrapper[4815]: I1207 19:51:54.534037 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerDied","Data":"724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577"} Dec 07 19:51:54 crc 
kubenswrapper[4815]: I1207 19:51:54.534255 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerStarted","Data":"6daa27923142cef3fe2161088b5b0f5324581950673be28f49de5b687ba67226"} Dec 07 19:51:56 crc kubenswrapper[4815]: I1207 19:51:56.360119 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:51:56 crc kubenswrapper[4815]: I1207 19:51:56.360726 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:51:56 crc kubenswrapper[4815]: I1207 19:51:56.727816 4815 generic.go:334] "Generic (PLEG): container finished" podID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerID="9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2" exitCode=0 Dec 07 19:51:56 crc kubenswrapper[4815]: I1207 19:51:56.727881 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerDied","Data":"9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2"} Dec 07 19:51:57 crc kubenswrapper[4815]: I1207 19:51:57.743311 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerStarted","Data":"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a"} Dec 07 19:51:57 crc kubenswrapper[4815]: I1207 
19:51:57.774451 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vtfh5" podStartSLOduration=2.910720898 podStartE2EDuration="5.774430166s" podCreationTimestamp="2025-12-07 19:51:52 +0000 UTC" firstStartedPulling="2025-12-07 19:51:54.537635622 +0000 UTC m=+2219.116625667" lastFinishedPulling="2025-12-07 19:51:57.40134489 +0000 UTC m=+2221.980334935" observedRunningTime="2025-12-07 19:51:57.767042116 +0000 UTC m=+2222.346032191" watchObservedRunningTime="2025-12-07 19:51:57.774430166 +0000 UTC m=+2222.353420211" Dec 07 19:52:02 crc kubenswrapper[4815]: I1207 19:52:02.942948 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:02 crc kubenswrapper[4815]: I1207 19:52:02.943511 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:03 crc kubenswrapper[4815]: I1207 19:52:03.066326 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:03 crc kubenswrapper[4815]: I1207 19:52:03.834030 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:03 crc kubenswrapper[4815]: I1207 19:52:03.904563 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:52:05 crc kubenswrapper[4815]: I1207 19:52:05.801551 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vtfh5" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="registry-server" containerID="cri-o://2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a" gracePeriod=2 Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.264427 4815 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.434131 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content\") pod \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.434194 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities\") pod \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.434248 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5c87\" (UniqueName: \"kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87\") pod \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\" (UID: \"9f943a60-acea-4d9c-84f4-d2a33cbc0f75\") " Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.435891 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities" (OuterVolumeSpecName: "utilities") pod "9f943a60-acea-4d9c-84f4-d2a33cbc0f75" (UID: "9f943a60-acea-4d9c-84f4-d2a33cbc0f75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.441105 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87" (OuterVolumeSpecName: "kube-api-access-r5c87") pod "9f943a60-acea-4d9c-84f4-d2a33cbc0f75" (UID: "9f943a60-acea-4d9c-84f4-d2a33cbc0f75"). InnerVolumeSpecName "kube-api-access-r5c87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.457193 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f943a60-acea-4d9c-84f4-d2a33cbc0f75" (UID: "9f943a60-acea-4d9c-84f4-d2a33cbc0f75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.537249 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.537780 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.537868 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5c87\" (UniqueName: \"kubernetes.io/projected/9f943a60-acea-4d9c-84f4-d2a33cbc0f75-kube-api-access-r5c87\") on node \"crc\" DevicePath \"\"" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.815861 4815 generic.go:334] "Generic (PLEG): container finished" podID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerID="2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a" exitCode=0 Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.816166 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerDied","Data":"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a"} Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.816950 4815 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-vtfh5" event={"ID":"9f943a60-acea-4d9c-84f4-d2a33cbc0f75","Type":"ContainerDied","Data":"6daa27923142cef3fe2161088b5b0f5324581950673be28f49de5b687ba67226"} Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.817015 4815 scope.go:117] "RemoveContainer" containerID="2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.816355 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vtfh5" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.848644 4815 scope.go:117] "RemoveContainer" containerID="9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.897283 4815 scope.go:117] "RemoveContainer" containerID="724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.901295 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.909131 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vtfh5"] Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.944186 4815 scope.go:117] "RemoveContainer" containerID="2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a" Dec 07 19:52:06 crc kubenswrapper[4815]: E1207 19:52:06.944669 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a\": container with ID starting with 2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a not found: ID does not exist" containerID="2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.944709 4815 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a"} err="failed to get container status \"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a\": rpc error: code = NotFound desc = could not find container \"2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a\": container with ID starting with 2b9cbc4743b1b0b7c7fe5e70696af93715f340fbf92a510494e2399f99fc165a not found: ID does not exist" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.944735 4815 scope.go:117] "RemoveContainer" containerID="9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2" Dec 07 19:52:06 crc kubenswrapper[4815]: E1207 19:52:06.945094 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2\": container with ID starting with 9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2 not found: ID does not exist" containerID="9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.945137 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2"} err="failed to get container status \"9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2\": rpc error: code = NotFound desc = could not find container \"9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2\": container with ID starting with 9cd565b304de6d6e6989a1e3530d04f5a0ebcc22c71e4718dbac28426c96c0e2 not found: ID does not exist" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.945151 4815 scope.go:117] "RemoveContainer" containerID="724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577" Dec 07 19:52:06 crc kubenswrapper[4815]: E1207 
19:52:06.945555 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577\": container with ID starting with 724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577 not found: ID does not exist" containerID="724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577" Dec 07 19:52:06 crc kubenswrapper[4815]: I1207 19:52:06.945575 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577"} err="failed to get container status \"724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577\": rpc error: code = NotFound desc = could not find container \"724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577\": container with ID starting with 724681c9f5b7ab949145af7e55ce409084fa79ba4f3b4ef18f58d214c8c1f577 not found: ID does not exist" Dec 07 19:52:07 crc kubenswrapper[4815]: I1207 19:52:07.789609 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" path="/var/lib/kubelet/pods/9f943a60-acea-4d9c-84f4-d2a33cbc0f75/volumes" Dec 07 19:52:26 crc kubenswrapper[4815]: I1207 19:52:26.360248 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:52:26 crc kubenswrapper[4815]: I1207 19:52:26.360829 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 07 19:52:56 crc kubenswrapper[4815]: I1207 19:52:56.359817 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 19:52:56 crc kubenswrapper[4815]: I1207 19:52:56.360581 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 19:52:56 crc kubenswrapper[4815]: I1207 19:52:56.360634 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 19:52:56 crc kubenswrapper[4815]: I1207 19:52:56.361577 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 19:52:56 crc kubenswrapper[4815]: I1207 19:52:56.361647 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" gracePeriod=600 Dec 07 19:52:56 crc kubenswrapper[4815]: E1207 19:52:56.487173 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:52:57 crc kubenswrapper[4815]: I1207 19:52:57.302618 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" exitCode=0 Dec 07 19:52:57 crc kubenswrapper[4815]: I1207 19:52:57.302664 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7"} Dec 07 19:52:57 crc kubenswrapper[4815]: I1207 19:52:57.302704 4815 scope.go:117] "RemoveContainer" containerID="a4d1ab49e52a51b21d475b2ec69e983b5266f8725eddc625be4fa0eac2776a34" Dec 07 19:52:57 crc kubenswrapper[4815]: I1207 19:52:57.304210 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:52:57 crc kubenswrapper[4815]: E1207 19:52:57.304814 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.502548 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:00 crc kubenswrapper[4815]: E1207 19:53:00.503937 4815 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="extract-utilities" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.503957 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="extract-utilities" Dec 07 19:53:00 crc kubenswrapper[4815]: E1207 19:53:00.503971 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="extract-content" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.503978 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="extract-content" Dec 07 19:53:00 crc kubenswrapper[4815]: E1207 19:53:00.503995 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="registry-server" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.504004 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="registry-server" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.504207 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f943a60-acea-4d9c-84f4-d2a33cbc0f75" containerName="registry-server" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.506333 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.533871 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.579086 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2lhl\" (UniqueName: \"kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.579180 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.579231 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.680164 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2lhl\" (UniqueName: \"kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.680224 4815 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.680259 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.680941 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.681083 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.705758 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2lhl\" (UniqueName: \"kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl\") pod \"community-operators-fcsgn\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:00 crc kubenswrapper[4815]: I1207 19:53:00.839947 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:01 crc kubenswrapper[4815]: I1207 19:53:01.484399 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:01 crc kubenswrapper[4815]: W1207 19:53:01.495113 4815 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fcbf0e_2643_4cc0_9b87_913848a9c52c.slice/crio-0fa54f9b43fccb1c1d54b95617b8c110cd099b3e04adfe9cd568982dadfbd962 WatchSource:0}: Error finding container 0fa54f9b43fccb1c1d54b95617b8c110cd099b3e04adfe9cd568982dadfbd962: Status 404 returned error can't find the container with id 0fa54f9b43fccb1c1d54b95617b8c110cd099b3e04adfe9cd568982dadfbd962 Dec 07 19:53:02 crc kubenswrapper[4815]: I1207 19:53:02.357989 4815 generic.go:334] "Generic (PLEG): container finished" podID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerID="8c63421b89195346227ee64accbd7203e41b9307484759b56cc10003038485cf" exitCode=0 Dec 07 19:53:02 crc kubenswrapper[4815]: I1207 19:53:02.358114 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerDied","Data":"8c63421b89195346227ee64accbd7203e41b9307484759b56cc10003038485cf"} Dec 07 19:53:02 crc kubenswrapper[4815]: I1207 19:53:02.358323 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerStarted","Data":"0fa54f9b43fccb1c1d54b95617b8c110cd099b3e04adfe9cd568982dadfbd962"} Dec 07 19:53:03 crc kubenswrapper[4815]: I1207 19:53:03.370129 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" 
event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerStarted","Data":"400780fe4452b984b7606cc281109a737277ff05cbb5b32ab1d741aed88021a9"} Dec 07 19:53:04 crc kubenswrapper[4815]: I1207 19:53:04.382480 4815 generic.go:334] "Generic (PLEG): container finished" podID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerID="400780fe4452b984b7606cc281109a737277ff05cbb5b32ab1d741aed88021a9" exitCode=0 Dec 07 19:53:04 crc kubenswrapper[4815]: I1207 19:53:04.382716 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerDied","Data":"400780fe4452b984b7606cc281109a737277ff05cbb5b32ab1d741aed88021a9"} Dec 07 19:53:05 crc kubenswrapper[4815]: I1207 19:53:05.394724 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerStarted","Data":"bbe0e5f8f83f71a70c2abe4f41f022083ae0d505469bad4a194cf6a3f6b60068"} Dec 07 19:53:05 crc kubenswrapper[4815]: I1207 19:53:05.418466 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcsgn" podStartSLOduration=3.000224949 podStartE2EDuration="5.418438588s" podCreationTimestamp="2025-12-07 19:53:00 +0000 UTC" firstStartedPulling="2025-12-07 19:53:02.360605697 +0000 UTC m=+2286.939595752" lastFinishedPulling="2025-12-07 19:53:04.778819336 +0000 UTC m=+2289.357809391" observedRunningTime="2025-12-07 19:53:05.416442771 +0000 UTC m=+2289.995432826" watchObservedRunningTime="2025-12-07 19:53:05.418438588 +0000 UTC m=+2289.997428633" Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.894579 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.897394 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.905349 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.979723 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.979808 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjqmg\" (UniqueName: \"kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:09 crc kubenswrapper[4815]: I1207 19:53:09.979839 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.080822 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.081014 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.081073 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjqmg\" (UniqueName: \"kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.081450 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.081459 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.098968 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjqmg\" (UniqueName: \"kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg\") pod \"certified-operators-wz5jv\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.248301 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.822070 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.840545 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.840606 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:10 crc kubenswrapper[4815]: I1207 19:53:10.917609 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:11 crc kubenswrapper[4815]: I1207 19:53:11.568961 4815 generic.go:334] "Generic (PLEG): container finished" podID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerID="41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0" exitCode=0 Dec 07 19:53:11 crc kubenswrapper[4815]: I1207 19:53:11.570473 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerDied","Data":"41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0"} Dec 07 19:53:11 crc kubenswrapper[4815]: I1207 19:53:11.570506 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerStarted","Data":"6f9de82b572c950f36f0bc1e6fb22ffbd409c8bfb4888c89bf2b6f73b4619924"} Dec 07 19:53:11 crc kubenswrapper[4815]: I1207 19:53:11.628198 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:11 crc kubenswrapper[4815]: I1207 19:53:11.769726 4815 scope.go:117] 
"RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:53:11 crc kubenswrapper[4815]: E1207 19:53:11.770012 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:53:12 crc kubenswrapper[4815]: I1207 19:53:12.579932 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerStarted","Data":"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba"} Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.267356 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.268597 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fcsgn" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="registry-server" containerID="cri-o://bbe0e5f8f83f71a70c2abe4f41f022083ae0d505469bad4a194cf6a3f6b60068" gracePeriod=2 Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.600168 4815 generic.go:334] "Generic (PLEG): container finished" podID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerID="bbe0e5f8f83f71a70c2abe4f41f022083ae0d505469bad4a194cf6a3f6b60068" exitCode=0 Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.600258 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" 
event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerDied","Data":"bbe0e5f8f83f71a70c2abe4f41f022083ae0d505469bad4a194cf6a3f6b60068"} Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.602678 4815 generic.go:334] "Generic (PLEG): container finished" podID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerID="cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba" exitCode=0 Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.602721 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerDied","Data":"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba"} Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.723257 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.820089 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities\") pod \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.820270 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content\") pod \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.820327 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2lhl\" (UniqueName: \"kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl\") pod \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\" (UID: \"b2fcbf0e-2643-4cc0-9b87-913848a9c52c\") " Dec 07 
19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.821044 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities" (OuterVolumeSpecName: "utilities") pod "b2fcbf0e-2643-4cc0-9b87-913848a9c52c" (UID: "b2fcbf0e-2643-4cc0-9b87-913848a9c52c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.826716 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl" (OuterVolumeSpecName: "kube-api-access-k2lhl") pod "b2fcbf0e-2643-4cc0-9b87-913848a9c52c" (UID: "b2fcbf0e-2643-4cc0-9b87-913848a9c52c"). InnerVolumeSpecName "kube-api-access-k2lhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.871321 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2fcbf0e-2643-4cc0-9b87-913848a9c52c" (UID: "b2fcbf0e-2643-4cc0-9b87-913848a9c52c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.922623 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.922654 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2lhl\" (UniqueName: \"kubernetes.io/projected/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-kube-api-access-k2lhl\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:14 crc kubenswrapper[4815]: I1207 19:53:14.922666 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2fcbf0e-2643-4cc0-9b87-913848a9c52c-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.614326 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerStarted","Data":"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4"} Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.618373 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcsgn" event={"ID":"b2fcbf0e-2643-4cc0-9b87-913848a9c52c","Type":"ContainerDied","Data":"0fa54f9b43fccb1c1d54b95617b8c110cd099b3e04adfe9cd568982dadfbd962"} Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.618480 4815 scope.go:117] "RemoveContainer" containerID="bbe0e5f8f83f71a70c2abe4f41f022083ae0d505469bad4a194cf6a3f6b60068" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.618640 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcsgn" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.644516 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wz5jv" podStartSLOduration=3.074355073 podStartE2EDuration="6.644495666s" podCreationTimestamp="2025-12-07 19:53:09 +0000 UTC" firstStartedPulling="2025-12-07 19:53:11.571296463 +0000 UTC m=+2296.150286508" lastFinishedPulling="2025-12-07 19:53:15.141437036 +0000 UTC m=+2299.720427101" observedRunningTime="2025-12-07 19:53:15.643712364 +0000 UTC m=+2300.222702409" watchObservedRunningTime="2025-12-07 19:53:15.644495666 +0000 UTC m=+2300.223485711" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.647020 4815 scope.go:117] "RemoveContainer" containerID="400780fe4452b984b7606cc281109a737277ff05cbb5b32ab1d741aed88021a9" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.672614 4815 scope.go:117] "RemoveContainer" containerID="8c63421b89195346227ee64accbd7203e41b9307484759b56cc10003038485cf" Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.682171 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.690819 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fcsgn"] Dec 07 19:53:15 crc kubenswrapper[4815]: I1207 19:53:15.784326 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" path="/var/lib/kubelet/pods/b2fcbf0e-2643-4cc0-9b87-913848a9c52c/volumes" Dec 07 19:53:20 crc kubenswrapper[4815]: I1207 19:53:20.249537 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:20 crc kubenswrapper[4815]: I1207 19:53:20.250128 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:20 crc kubenswrapper[4815]: I1207 19:53:20.297048 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:20 crc kubenswrapper[4815]: I1207 19:53:20.719400 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:20 crc kubenswrapper[4815]: I1207 19:53:20.781965 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:22 crc kubenswrapper[4815]: I1207 19:53:22.685121 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wz5jv" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="registry-server" containerID="cri-o://314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4" gracePeriod=2 Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.121630 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.282942 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content\") pod \"6b462a38-fd9e-4541-aa10-c816c6eb7113\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.283370 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjqmg\" (UniqueName: \"kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg\") pod \"6b462a38-fd9e-4541-aa10-c816c6eb7113\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.283568 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities\") pod \"6b462a38-fd9e-4541-aa10-c816c6eb7113\" (UID: \"6b462a38-fd9e-4541-aa10-c816c6eb7113\") " Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.284541 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities" (OuterVolumeSpecName: "utilities") pod "6b462a38-fd9e-4541-aa10-c816c6eb7113" (UID: "6b462a38-fd9e-4541-aa10-c816c6eb7113"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.299115 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg" (OuterVolumeSpecName: "kube-api-access-kjqmg") pod "6b462a38-fd9e-4541-aa10-c816c6eb7113" (UID: "6b462a38-fd9e-4541-aa10-c816c6eb7113"). InnerVolumeSpecName "kube-api-access-kjqmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.341197 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b462a38-fd9e-4541-aa10-c816c6eb7113" (UID: "6b462a38-fd9e-4541-aa10-c816c6eb7113"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.386290 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.386338 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjqmg\" (UniqueName: \"kubernetes.io/projected/6b462a38-fd9e-4541-aa10-c816c6eb7113-kube-api-access-kjqmg\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.386355 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b462a38-fd9e-4541-aa10-c816c6eb7113-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.698753 4815 generic.go:334] "Generic (PLEG): container finished" podID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerID="314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4" exitCode=0 Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.698806 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerDied","Data":"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4"} Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.698822 4815 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wz5jv" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.698845 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wz5jv" event={"ID":"6b462a38-fd9e-4541-aa10-c816c6eb7113","Type":"ContainerDied","Data":"6f9de82b572c950f36f0bc1e6fb22ffbd409c8bfb4888c89bf2b6f73b4619924"} Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.698869 4815 scope.go:117] "RemoveContainer" containerID="314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.726560 4815 scope.go:117] "RemoveContainer" containerID="cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.746316 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.755498 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wz5jv"] Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.763683 4815 scope.go:117] "RemoveContainer" containerID="41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.785047 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" path="/var/lib/kubelet/pods/6b462a38-fd9e-4541-aa10-c816c6eb7113/volumes" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.801969 4815 scope.go:117] "RemoveContainer" containerID="314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4" Dec 07 19:53:23 crc kubenswrapper[4815]: E1207 19:53:23.802581 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4\": container with ID 
starting with 314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4 not found: ID does not exist" containerID="314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.802662 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4"} err="failed to get container status \"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4\": rpc error: code = NotFound desc = could not find container \"314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4\": container with ID starting with 314e68d08a4a23ee592dcc31a05f8380de8b0b73231debad0a10536badce74a4 not found: ID does not exist" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.802692 4815 scope.go:117] "RemoveContainer" containerID="cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba" Dec 07 19:53:23 crc kubenswrapper[4815]: E1207 19:53:23.803160 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba\": container with ID starting with cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba not found: ID does not exist" containerID="cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.803187 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba"} err="failed to get container status \"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba\": rpc error: code = NotFound desc = could not find container \"cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba\": container with ID starting with cb51fed531f85f33d05ae38d224d622488a72e381d03fdb4bf212aac41c289ba not found: 
ID does not exist" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.803208 4815 scope.go:117] "RemoveContainer" containerID="41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0" Dec 07 19:53:23 crc kubenswrapper[4815]: E1207 19:53:23.803480 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0\": container with ID starting with 41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0 not found: ID does not exist" containerID="41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0" Dec 07 19:53:23 crc kubenswrapper[4815]: I1207 19:53:23.803509 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0"} err="failed to get container status \"41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0\": rpc error: code = NotFound desc = could not find container \"41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0\": container with ID starting with 41c0aaf2dedb4e2052faa347149e1fb7858a3a22387f3819feb4dc51c4a2b5a0 not found: ID does not exist" Dec 07 19:53:26 crc kubenswrapper[4815]: I1207 19:53:26.769932 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:53:26 crc kubenswrapper[4815]: E1207 19:53:26.770498 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:53:39 crc kubenswrapper[4815]: I1207 19:53:39.770495 4815 
scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:53:39 crc kubenswrapper[4815]: E1207 19:53:39.771224 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:53:50 crc kubenswrapper[4815]: I1207 19:53:50.769669 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:53:50 crc kubenswrapper[4815]: E1207 19:53:50.770423 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:01 crc kubenswrapper[4815]: I1207 19:54:01.770211 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:54:01 crc kubenswrapper[4815]: E1207 19:54:01.771594 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:13 crc kubenswrapper[4815]: I1207 
19:54:13.770595 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:54:13 crc kubenswrapper[4815]: E1207 19:54:13.771332 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:26 crc kubenswrapper[4815]: I1207 19:54:26.770872 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:54:26 crc kubenswrapper[4815]: E1207 19:54:26.771550 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:37 crc kubenswrapper[4815]: I1207 19:54:37.770104 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:54:37 crc kubenswrapper[4815]: E1207 19:54:37.771466 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:46 crc 
kubenswrapper[4815]: I1207 19:54:46.267860 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qb8tq/must-gather-nlthc"] Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.268875 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="extract-content" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.268889 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="extract-content" Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.268908 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="extract-utilities" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.268930 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="extract-utilities" Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.268976 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="extract-utilities" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.268984 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="extract-utilities" Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.268996 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.269003 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.269019 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="extract-content" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 
19:54:46.269026 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="extract-content" Dec 07 19:54:46 crc kubenswrapper[4815]: E1207 19:54:46.269047 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.269053 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.269276 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b462a38-fd9e-4541-aa10-c816c6eb7113" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.269297 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2fcbf0e-2643-4cc0-9b87-913848a9c52c" containerName="registry-server" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.270256 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.272674 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qb8tq"/"kube-root-ca.crt" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.279985 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qb8tq"/"openshift-service-ca.crt" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.295522 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qb8tq/must-gather-nlthc"] Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.334635 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwx9r\" (UniqueName: \"kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.334739 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.435985 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.436103 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pwx9r\" (UniqueName: \"kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.436661 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.470969 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwx9r\" (UniqueName: \"kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r\") pod \"must-gather-nlthc\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:46 crc kubenswrapper[4815]: I1207 19:54:46.593768 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 19:54:47 crc kubenswrapper[4815]: I1207 19:54:47.108935 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qb8tq/must-gather-nlthc"] Dec 07 19:54:47 crc kubenswrapper[4815]: I1207 19:54:47.114095 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 19:54:47 crc kubenswrapper[4815]: I1207 19:54:47.564800 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/must-gather-nlthc" event={"ID":"a5376abd-47e3-4a79-a593-05cf789b4e16","Type":"ContainerStarted","Data":"806d3392997aa0344002ea716edf9b4dc2c832f1480e5aa62391911ba8164bab"} Dec 07 19:54:52 crc kubenswrapper[4815]: I1207 19:54:52.769621 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:54:52 crc kubenswrapper[4815]: E1207 19:54:52.770368 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:54:54 crc kubenswrapper[4815]: I1207 19:54:54.627243 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/must-gather-nlthc" event={"ID":"a5376abd-47e3-4a79-a593-05cf789b4e16","Type":"ContainerStarted","Data":"a6230ffa4b25b0286d26e2ea5b7549738bdade33db32732a6c327dce420331dc"} Dec 07 19:54:55 crc kubenswrapper[4815]: I1207 19:54:55.810250 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/must-gather-nlthc" 
event={"ID":"a5376abd-47e3-4a79-a593-05cf789b4e16","Type":"ContainerStarted","Data":"5005ec3c631cef5a12a6161db09f8a74fef2d877c88d782af4f9c8bdebddfac9"} Dec 07 19:54:55 crc kubenswrapper[4815]: I1207 19:54:55.877504 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qb8tq/must-gather-nlthc" podStartSLOduration=2.7295410540000002 podStartE2EDuration="9.877483732s" podCreationTimestamp="2025-12-07 19:54:46 +0000 UTC" firstStartedPulling="2025-12-07 19:54:47.114044084 +0000 UTC m=+2391.693034129" lastFinishedPulling="2025-12-07 19:54:54.261986762 +0000 UTC m=+2398.840976807" observedRunningTime="2025-12-07 19:54:55.873463938 +0000 UTC m=+2400.452453983" watchObservedRunningTime="2025-12-07 19:54:55.877483732 +0000 UTC m=+2400.456473777" Dec 07 19:54:58 crc kubenswrapper[4815]: E1207 19:54:58.105369 4815 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.2:53578->38.102.83.2:39939: write tcp 38.102.83.2:53578->38.102.83.2:39939: write: broken pipe Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.097001 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c57fn"] Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.098579 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.100294 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.100462 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqrkt\" (UniqueName: \"kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.102149 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qb8tq"/"default-dockercfg-7kgrn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.201990 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.202109 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqrkt\" (UniqueName: \"kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.202137 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.222271 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqrkt\" (UniqueName: \"kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt\") pod \"crc-debug-c57fn\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.414066 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:54:59 crc kubenswrapper[4815]: I1207 19:54:59.910265 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" event={"ID":"c5aad30c-3282-4b76-b5f5-a0d8f7462f30","Type":"ContainerStarted","Data":"88fb6a5c725cdf354ec31fb86c55fdfb5d2ac481daa3fc2bcf3b85f0afe7d277"} Dec 07 19:55:05 crc kubenswrapper[4815]: I1207 19:55:05.784215 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:55:05 crc kubenswrapper[4815]: E1207 19:55:05.786197 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:55:13 crc kubenswrapper[4815]: I1207 19:55:13.031252 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" 
event={"ID":"c5aad30c-3282-4b76-b5f5-a0d8f7462f30","Type":"ContainerStarted","Data":"3ab313a71c7383bd71e5bbd5132f8a4f4ffb5309d70922e87b979c5d16a94146"} Dec 07 19:55:13 crc kubenswrapper[4815]: I1207 19:55:13.047036 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" podStartSLOduration=0.869252973 podStartE2EDuration="14.047014109s" podCreationTimestamp="2025-12-07 19:54:59 +0000 UTC" firstStartedPulling="2025-12-07 19:54:59.466792396 +0000 UTC m=+2404.045782461" lastFinishedPulling="2025-12-07 19:55:12.644553552 +0000 UTC m=+2417.223543597" observedRunningTime="2025-12-07 19:55:13.044819427 +0000 UTC m=+2417.623809482" watchObservedRunningTime="2025-12-07 19:55:13.047014109 +0000 UTC m=+2417.626004154" Dec 07 19:55:16 crc kubenswrapper[4815]: I1207 19:55:16.770149 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:55:16 crc kubenswrapper[4815]: E1207 19:55:16.771168 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:55:31 crc kubenswrapper[4815]: I1207 19:55:31.770454 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:55:31 crc kubenswrapper[4815]: E1207 19:55:31.771780 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:55:35 crc kubenswrapper[4815]: I1207 19:55:35.230502 4815 generic.go:334] "Generic (PLEG): container finished" podID="c5aad30c-3282-4b76-b5f5-a0d8f7462f30" containerID="3ab313a71c7383bd71e5bbd5132f8a4f4ffb5309d70922e87b979c5d16a94146" exitCode=0 Dec 07 19:55:35 crc kubenswrapper[4815]: I1207 19:55:35.230588 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" event={"ID":"c5aad30c-3282-4b76-b5f5-a0d8f7462f30","Type":"ContainerDied","Data":"3ab313a71c7383bd71e5bbd5132f8a4f4ffb5309d70922e87b979c5d16a94146"} Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.338280 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.365431 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c57fn"] Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.373239 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c57fn"] Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.474725 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host\") pod \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\" (UID: \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.474975 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqrkt\" (UniqueName: \"kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt\") pod \"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\" (UID: 
\"c5aad30c-3282-4b76-b5f5-a0d8f7462f30\") " Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.475113 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host" (OuterVolumeSpecName: "host") pod "c5aad30c-3282-4b76-b5f5-a0d8f7462f30" (UID: "c5aad30c-3282-4b76-b5f5-a0d8f7462f30"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.480469 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt" (OuterVolumeSpecName: "kube-api-access-tqrkt") pod "c5aad30c-3282-4b76-b5f5-a0d8f7462f30" (UID: "c5aad30c-3282-4b76-b5f5-a0d8f7462f30"). InnerVolumeSpecName "kube-api-access-tqrkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.577099 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqrkt\" (UniqueName: \"kubernetes.io/projected/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-kube-api-access-tqrkt\") on node \"crc\" DevicePath \"\"" Dec 07 19:55:36 crc kubenswrapper[4815]: I1207 19:55:36.577328 4815 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c5aad30c-3282-4b76-b5f5-a0d8f7462f30-host\") on node \"crc\" DevicePath \"\"" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.247511 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88fb6a5c725cdf354ec31fb86c55fdfb5d2ac481daa3fc2bcf3b85f0afe7d277" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.247570 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c57fn" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.561290 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c2sx4"] Dec 07 19:55:37 crc kubenswrapper[4815]: E1207 19:55:37.561779 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aad30c-3282-4b76-b5f5-a0d8f7462f30" containerName="container-00" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.561796 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aad30c-3282-4b76-b5f5-a0d8f7462f30" containerName="container-00" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.562008 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5aad30c-3282-4b76-b5f5-a0d8f7462f30" containerName="container-00" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.562706 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.564830 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qb8tq"/"default-dockercfg-7kgrn" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.703361 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mscb\" (UniqueName: \"kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.703451 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " 
pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.783998 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5aad30c-3282-4b76-b5f5-a0d8f7462f30" path="/var/lib/kubelet/pods/c5aad30c-3282-4b76-b5f5-a0d8f7462f30/volumes" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.805021 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mscb\" (UniqueName: \"kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.805137 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.805254 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.840143 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mscb\" (UniqueName: \"kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb\") pod \"crc-debug-c2sx4\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:37 crc kubenswrapper[4815]: I1207 19:55:37.881144 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:38 crc kubenswrapper[4815]: I1207 19:55:38.256395 4815 generic.go:334] "Generic (PLEG): container finished" podID="04798220-974a-40e6-abfb-cc2df600088c" containerID="7b9f409fc018bb1a61421c45262dad14fc50dadfabd5a15f87a1f03e8641d502" exitCode=1 Dec 07 19:55:38 crc kubenswrapper[4815]: I1207 19:55:38.256431 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" event={"ID":"04798220-974a-40e6-abfb-cc2df600088c","Type":"ContainerDied","Data":"7b9f409fc018bb1a61421c45262dad14fc50dadfabd5a15f87a1f03e8641d502"} Dec 07 19:55:38 crc kubenswrapper[4815]: I1207 19:55:38.256747 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" event={"ID":"04798220-974a-40e6-abfb-cc2df600088c","Type":"ContainerStarted","Data":"b60a871e1646b3b2db56f35cbe84db5b918831ddeb3c694e8d077210560cea59"} Dec 07 19:55:38 crc kubenswrapper[4815]: I1207 19:55:38.295150 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c2sx4"] Dec 07 19:55:38 crc kubenswrapper[4815]: I1207 19:55:38.332030 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qb8tq/crc-debug-c2sx4"] Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.354279 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.439062 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mscb\" (UniqueName: \"kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb\") pod \"04798220-974a-40e6-abfb-cc2df600088c\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.439154 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host\") pod \"04798220-974a-40e6-abfb-cc2df600088c\" (UID: \"04798220-974a-40e6-abfb-cc2df600088c\") " Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.439303 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host" (OuterVolumeSpecName: "host") pod "04798220-974a-40e6-abfb-cc2df600088c" (UID: "04798220-974a-40e6-abfb-cc2df600088c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.439625 4815 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/04798220-974a-40e6-abfb-cc2df600088c-host\") on node \"crc\" DevicePath \"\"" Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.451647 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb" (OuterVolumeSpecName: "kube-api-access-5mscb") pod "04798220-974a-40e6-abfb-cc2df600088c" (UID: "04798220-974a-40e6-abfb-cc2df600088c"). InnerVolumeSpecName "kube-api-access-5mscb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.541676 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mscb\" (UniqueName: \"kubernetes.io/projected/04798220-974a-40e6-abfb-cc2df600088c-kube-api-access-5mscb\") on node \"crc\" DevicePath \"\"" Dec 07 19:55:39 crc kubenswrapper[4815]: I1207 19:55:39.780401 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04798220-974a-40e6-abfb-cc2df600088c" path="/var/lib/kubelet/pods/04798220-974a-40e6-abfb-cc2df600088c/volumes" Dec 07 19:55:40 crc kubenswrapper[4815]: I1207 19:55:40.273510 4815 scope.go:117] "RemoveContainer" containerID="7b9f409fc018bb1a61421c45262dad14fc50dadfabd5a15f87a1f03e8641d502" Dec 07 19:55:40 crc kubenswrapper[4815]: I1207 19:55:40.273668 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb8tq/crc-debug-c2sx4" Dec 07 19:55:46 crc kubenswrapper[4815]: I1207 19:55:46.771110 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:55:46 crc kubenswrapper[4815]: E1207 19:55:46.771776 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:01 crc kubenswrapper[4815]: I1207 19:56:01.770426 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:56:01 crc kubenswrapper[4815]: E1207 19:56:01.772543 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:15 crc kubenswrapper[4815]: I1207 19:56:15.776072 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:56:15 crc kubenswrapper[4815]: E1207 19:56:15.777199 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:20 crc kubenswrapper[4815]: I1207 19:56:20.970093 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cfdf5698d-xlqfg_5eb493fe-dc18-41f8-8102-7d6b906d1a63/barbican-api/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.139002 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6cfdf5698d-xlqfg_5eb493fe-dc18-41f8-8102-7d6b906d1a63/barbican-api-log/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.166809 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bc648dd8-m5s6x_7b554cad-0000-4ecb-97df-7f0fbdb8c7e8/barbican-keystone-listener/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.242613 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bc648dd8-m5s6x_7b554cad-0000-4ecb-97df-7f0fbdb8c7e8/barbican-keystone-listener-log/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 
19:56:21.380804 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-867657f79f-kqjvk_22a26306-1a35-4196-8538-3361e51808fc/barbican-worker/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.508887 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-867657f79f-kqjvk_22a26306-1a35-4196-8538-3361e51808fc/barbican-worker-log/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.616319 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zcf4l_ee1d4597-bd09-4333-b5bd-e50c52c92cd3/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.743260 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2871977e-b513-4279-b52b-c9612ddc9005/ceilometer-central-agent/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.804836 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2871977e-b513-4279-b52b-c9612ddc9005/proxy-httpd/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.827701 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2871977e-b513-4279-b52b-c9612ddc9005/ceilometer-notification-agent/0.log" Dec 07 19:56:21 crc kubenswrapper[4815]: I1207 19:56:21.893558 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2871977e-b513-4279-b52b-c9612ddc9005/sg-core/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.038056 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-wnbh9_af58b4e7-d807-47a1-9c1c-21ab66fe7605/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.133798 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_b698eff3-3885-47f0-bdf5-3de49fa89141/cinder-api/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.525180 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_b698eff3-3885-47f0-bdf5-3de49fa89141/cinder-api-log/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.638393 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_82a1e678-f13e-4971-86e1-9e53a78da17a/cinder-scheduler/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.682433 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_82a1e678-f13e-4971-86e1-9e53a78da17a/probe/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.819289 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9pvhw_7a7fbd43-dafb-4aaa-be96-f84484865f64/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:22 crc kubenswrapper[4815]: I1207 19:56:22.911599 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-zfw9g_c6416bd8-ba0c-491b-b5b3-3a2ed9dff393/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.042163 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667ff9c869-wkhmg_8f7331f8-b581-46c1-9e19-4c4fc88a2f29/init/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.196115 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667ff9c869-wkhmg_8f7331f8-b581-46c1-9e19-4c4fc88a2f29/init/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.279581 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-667ff9c869-wkhmg_8f7331f8-b581-46c1-9e19-4c4fc88a2f29/dnsmasq-dns/0.log" Dec 07 19:56:23 crc 
kubenswrapper[4815]: I1207 19:56:23.304558 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-t8v85_acfde0af-ed46-4636-b117-d2c2c7b2c0c8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.635785 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2b5ed6cd-cad8-491a-b2a6-71868243996f/kube-state-metrics/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.644664 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-687459c5bc-4dzjp_010b84d9-e9f5-4902-994f-9cbf1ce02d26/keystone-api/0.log" Dec 07 19:56:23 crc kubenswrapper[4815]: I1207 19:56:23.960481 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869d694d95-htxnx_565a9de4-ca65-41ab-83e0-1e7091486701/neutron-api/0.log" Dec 07 19:56:24 crc kubenswrapper[4815]: I1207 19:56:24.116603 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-869d694d95-htxnx_565a9de4-ca65-41ab-83e0-1e7091486701/neutron-httpd/0.log" Dec 07 19:56:24 crc kubenswrapper[4815]: I1207 19:56:24.449235 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8df9c38-f244-46b7-a580-c8882a5a73bf/nova-api-api/0.log" Dec 07 19:56:24 crc kubenswrapper[4815]: I1207 19:56:24.600159 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a8df9c38-f244-46b7-a580-c8882a5a73bf/nova-api-log/0.log" Dec 07 19:56:24 crc kubenswrapper[4815]: I1207 19:56:24.871083 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2efcb819-9e42-4fbb-ad60-8264a02dc9a0/nova-cell0-conductor-conductor/0.log" Dec 07 19:56:25 crc kubenswrapper[4815]: I1207 19:56:25.041197 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_c9b7680d-2790-4020-8305-fb018ebbee97/nova-cell1-conductor-conductor/0.log" Dec 07 19:56:25 crc kubenswrapper[4815]: I1207 19:56:25.311569 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_1986217b-0022-4645-bbd4-a0e3be8a0d03/nova-cell1-novncproxy-novncproxy/0.log" Dec 07 19:56:25 crc kubenswrapper[4815]: I1207 19:56:25.472335 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_809d96c5-69a9-4892-b39c-34cf1313aff6/nova-metadata-log/0.log" Dec 07 19:56:25 crc kubenswrapper[4815]: I1207 19:56:25.979170 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1dbe253c-608d-4711-904b-44926572c998/mysql-bootstrap/0.log" Dec 07 19:56:25 crc kubenswrapper[4815]: I1207 19:56:25.983899 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a3117427-1c45-4b4b-bf1d-d0ed52ff2bfa/nova-scheduler-scheduler/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.166051 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1dbe253c-608d-4711-904b-44926572c998/mysql-bootstrap/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.189348 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_1dbe253c-608d-4711-904b-44926572c998/galera/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.227829 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_809d96c5-69a9-4892-b39c-34cf1313aff6/nova-metadata-metadata/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.409421 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b428ca75-b6e8-428e-be32-eb320bacbdda/mysql-bootstrap/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.607791 4815 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_b428ca75-b6e8-428e-be32-eb320bacbdda/mysql-bootstrap/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.687217 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b428ca75-b6e8-428e-be32-eb320bacbdda/galera/0.log" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.770179 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:56:26 crc kubenswrapper[4815]: E1207 19:56:26.770400 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:26 crc kubenswrapper[4815]: I1207 19:56:26.798416 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d115a489-cf3a-4e3c-af23-fbe3875a65f2/openstackclient/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.042574 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-24gkx_5d3ed0f7-1ea2-48e7-bab4-f5a709da4850/ovn-controller/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.136531 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7fzms_83334f9a-9d33-4229-9552-9f69c525dd82/openstack-network-exporter/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.331890 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhvc5_2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21/ovsdb-server-init/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.596048 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-bhvc5_2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21/ovsdb-server-init/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.628950 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhvc5_2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21/ovsdb-server/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.660448 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bhvc5_2889719b-c3b6-4a36-8c6e-4b3e9f9a2c21/ovs-vswitchd/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.808106 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_46adfee5-dc94-49a0-bf62-33d15b10e89c/openstack-network-exporter/0.log" Dec 07 19:56:27 crc kubenswrapper[4815]: I1207 19:56:27.882929 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_46adfee5-dc94-49a0-bf62-33d15b10e89c/ovn-northd/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.001982 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4ebfeb2d-526f-4f02-be1d-def3f49c555c/openstack-network-exporter/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.228657 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4ebfeb2d-526f-4f02-be1d-def3f49c555c/ovsdbserver-nb/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.274464 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17df86ce-9c36-46d9-b5f2-d5dae8ae4675/openstack-network-exporter/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.308888 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_17df86ce-9c36-46d9-b5f2-d5dae8ae4675/ovsdbserver-sb/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.464559 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-8bbdc95bd-g568f_7f1e0e26-2abf-4719-8973-301f1d821a4e/placement-api/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.588870 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8bbdc95bd-g568f_7f1e0e26-2abf-4719-8973-301f1d821a4e/placement-log/0.log" Dec 07 19:56:28 crc kubenswrapper[4815]: I1207 19:56:28.735540 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_22bd77dc-c382-467f-928a-4be062c951ca/setup-container/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.003249 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_22bd77dc-c382-467f-928a-4be062c951ca/rabbitmq/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.043045 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_22bd77dc-c382-467f-928a-4be062c951ca/setup-container/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.081291 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_95704672-75eb-411c-a866-09ed671263f7/setup-container/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.311344 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_95704672-75eb-411c-a866-09ed671263f7/rabbitmq/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.411261 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_95704672-75eb-411c-a866-09ed671263f7/setup-container/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.499748 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-phdwz_f593e5ae-6df7-4ec1-b41b-e1ddc2f5e35e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.652749 4815 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-g84n8_6db783ed-5795-4761-8176-3d425073a274/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:29 crc kubenswrapper[4815]: I1207 19:56:29.784877 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-mp7vp_bfe86916-a7d7-41a5-853c-acb6c37a85c1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:30 crc kubenswrapper[4815]: I1207 19:56:30.165334 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-mcc2r_f094fb15-aab4-422c-b2a1-1d0fd8d7ff22/ssh-known-hosts-edpm-deployment/0.log" Dec 07 19:56:30 crc kubenswrapper[4815]: I1207 19:56:30.174011 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-5kdl5_91293094-d529-4fbf-84db-c166a7ebcb7f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 07 19:56:31 crc kubenswrapper[4815]: I1207 19:56:31.839414 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_77640a82-dd82-4fe6-89f8-58d8ab094b71/memcached/0.log" Dec 07 19:56:37 crc kubenswrapper[4815]: I1207 19:56:37.769616 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:56:37 crc kubenswrapper[4815]: E1207 19:56:37.771639 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:51 crc kubenswrapper[4815]: I1207 19:56:51.883666 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/util/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.119651 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/pull/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.154783 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/pull/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.164272 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/util/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.361744 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/util/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.421588 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/pull/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.464224 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4fcc6706180aae165b51ffb35b91d9cba63d1d00f0c693434074c2a16etbzsv_2c3a7f0a-597d-40ee-a0ad-7a83ff99a3c8/extract/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.602317 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jr22f_4c8e53f4-6dec-4655-b931-b8d0b8ddc8da/kube-rbac-proxy/0.log" Dec 07 19:56:52 crc 
kubenswrapper[4815]: I1207 19:56:52.642908 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-bq2qd_0cc9e387-27e4-4b5d-ac7a-d9f098acb973/kube-rbac-proxy/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.673755 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-jr22f_4c8e53f4-6dec-4655-b931-b8d0b8ddc8da/manager/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.770354 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:56:52 crc kubenswrapper[4815]: E1207 19:56:52.771049 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.854090 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-bq2qd_0cc9e387-27e4-4b5d-ac7a-d9f098acb973/manager/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.890989 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-psfhw_cbeacff5-ac96-4444-aa57-04a320582348/kube-rbac-proxy/0.log" Dec 07 19:56:52 crc kubenswrapper[4815]: I1207 19:56:52.958826 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-psfhw_cbeacff5-ac96-4444-aa57-04a320582348/manager/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.094854 4815 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wwz9_603612f4-25fa-4356-a99c-b054645d8919/kube-rbac-proxy/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.156652 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-7wwz9_603612f4-25fa-4356-a99c-b054645d8919/manager/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.262126 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-trnlm_d6eb6a40-b713-4a4a-9554-749f94cf1137/kube-rbac-proxy/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.301191 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-trnlm_d6eb6a40-b713-4a4a-9554-749f94cf1137/manager/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.414091 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-spddl_33bdbbd1-62ad-42d5-a10a-a5da1344af19/kube-rbac-proxy/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.466414 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-spddl_33bdbbd1-62ad-42d5-a10a-a5da1344af19/manager/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.609688 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5rmk7_01adf042-9afe-46be-ba0c-0c1a3f86ed8d/kube-rbac-proxy/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.819899 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5rmk7_01adf042-9afe-46be-ba0c-0c1a3f86ed8d/manager/0.log" Dec 07 19:56:53 crc 
kubenswrapper[4815]: I1207 19:56:53.850502 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-z7p7t_954764fa-df14-4604-96d2-6ddc12155406/manager/0.log" Dec 07 19:56:53 crc kubenswrapper[4815]: I1207 19:56:53.879057 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-z7p7t_954764fa-df14-4604-96d2-6ddc12155406/kube-rbac-proxy/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.002623 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5shnd_0404b56e-87cb-40c5-b11c-64dc7c960718/kube-rbac-proxy/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.134660 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-5shnd_0404b56e-87cb-40c5-b11c-64dc7c960718/manager/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.213903 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-7d9wv_b7efe346-4e91-45a1-84d6-6a5aac1c739c/kube-rbac-proxy/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.382143 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-4wf77_c727b772-a57c-4564-bf8d-7c8917b4bb0d/kube-rbac-proxy/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.702396 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x59hf_8055e37c-efd4-4c82-a9df-4d5e2a12ef63/kube-rbac-proxy/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.867569 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-7d9wv_b7efe346-4e91-45a1-84d6-6a5aac1c739c/manager/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.936072 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-x59hf_8055e37c-efd4-4c82-a9df-4d5e2a12ef63/manager/0.log" Dec 07 19:56:54 crc kubenswrapper[4815]: I1207 19:56:54.956186 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-4wf77_c727b772-a57c-4564-bf8d-7c8917b4bb0d/manager/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.118633 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-8mg5j_0dc4f082-f669-444f-a46e-9bca4cc20f31/kube-rbac-proxy/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.232261 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-8mg5j_0dc4f082-f669-444f-a46e-9bca4cc20f31/manager/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.350942 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-m6p4v_2cd3ca37-5d31-4068-88df-1344ebfad5e7/kube-rbac-proxy/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.429062 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-m6p4v_2cd3ca37-5d31-4068-88df-1344ebfad5e7/manager/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.553097 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fvvqrm_097ed41c-7445-4a2f-ba72-c9ff11bb0e28/kube-rbac-proxy/0.log" Dec 07 19:56:55 crc kubenswrapper[4815]: I1207 19:56:55.587952 
4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-84b575879fvvqrm_097ed41c-7445-4a2f-ba72-c9ff11bb0e28/manager/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.266213 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-76dbcc48db-k25j8_0d9bc620-6be8-4f29-aa0a-40c6f9d9c341/operator/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.399411 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-xw277_58821771-f5ba-40e9-8ba4-ccced4549d86/registry-server/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.643523 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dtpkv_34bdeed9-8374-4eea-a1d9-562d933324e9/kube-rbac-proxy/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.741516 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dtpkv_34bdeed9-8374-4eea-a1d9-562d933324e9/manager/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.927873 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dc7d5d6ff-dnrvh_33f773e2-304c-4d0b-98e9-6fd309462297/manager/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.934194 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4sb78_94d115be-3d8c-46e7-9a22-e09bb888afc8/kube-rbac-proxy/0.log" Dec 07 19:56:56 crc kubenswrapper[4815]: I1207 19:56:56.972003 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-4sb78_94d115be-3d8c-46e7-9a22-e09bb888afc8/manager/0.log" Dec 07 19:56:57 crc 
kubenswrapper[4815]: I1207 19:56:57.049205 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xx86c_2aeb4ccc-f6a6-4614-917c-55cf0a46c3cd/operator/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.216314 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-jsdgf_2e427dde-fbfa-4b36-9749-e83080d8733a/kube-rbac-proxy/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.240289 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-jsdgf_2e427dde-fbfa-4b36-9749-e83080d8733a/manager/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.272801 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-8spbl_8e51367a-70c8-4b67-b15f-ee4202171e38/kube-rbac-proxy/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.427221 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-gfvgj_bed3f9f7-d38b-4987-90fd-1c4a380165f4/manager/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.427909 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-gfvgj_bed3f9f7-d38b-4987-90fd-1c4a380165f4/kube-rbac-proxy/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.504397 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-8spbl_8e51367a-70c8-4b67-b15f-ee4202171e38/manager/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.617169 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kdbct_21b81ae2-586e-418c-867d-0d10c3c094eb/kube-rbac-proxy/0.log" Dec 07 19:56:57 crc kubenswrapper[4815]: I1207 19:56:57.656663 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-kdbct_21b81ae2-586e-418c-867d-0d10c3c094eb/manager/0.log" Dec 07 19:57:06 crc kubenswrapper[4815]: I1207 19:57:06.770009 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:57:06 crc kubenswrapper[4815]: E1207 19:57:06.770826 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:57:16 crc kubenswrapper[4815]: I1207 19:57:16.473234 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xnwvg_7ddadb50-73b1-4948-807b-fe26ca78ea76/control-plane-machine-set-operator/0.log" Dec 07 19:57:16 crc kubenswrapper[4815]: I1207 19:57:16.660515 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mxwsj_d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b/kube-rbac-proxy/0.log" Dec 07 19:57:16 crc kubenswrapper[4815]: I1207 19:57:16.776840 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-mxwsj_d1f45a14-a0f6-4585-8171-2ef3dc5e5d9b/machine-api-operator/0.log" Dec 07 19:57:17 crc kubenswrapper[4815]: I1207 19:57:17.770317 4815 scope.go:117] "RemoveContainer" 
containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:57:17 crc kubenswrapper[4815]: E1207 19:57:17.770624 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:57:29 crc kubenswrapper[4815]: I1207 19:57:29.004550 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-52gfx_2e33a89c-dfcf-456f-bf42-5f98dbe2cbca/cert-manager-controller/0.log" Dec 07 19:57:29 crc kubenswrapper[4815]: I1207 19:57:29.182665 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-t2msm_c3a5ced1-3996-46a0-9762-bf0ff63654db/cert-manager-cainjector/0.log" Dec 07 19:57:29 crc kubenswrapper[4815]: I1207 19:57:29.297195 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-gghm8_9181caec-fc89-487a-ba9b-269b81d5a18a/cert-manager-webhook/0.log" Dec 07 19:57:30 crc kubenswrapper[4815]: I1207 19:57:30.770629 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:57:30 crc kubenswrapper[4815]: E1207 19:57:30.771287 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:57:41 crc 
kubenswrapper[4815]: I1207 19:57:41.983383 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-dr8jc_b6174174-1523-4a61-84e4-3d2d8f12303f/nmstate-console-plugin/0.log" Dec 07 19:57:42 crc kubenswrapper[4815]: I1207 19:57:42.233422 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fnq5m_20b5ab6f-84d1-4d0f-a133-c800fbc797c1/nmstate-handler/0.log" Dec 07 19:57:42 crc kubenswrapper[4815]: I1207 19:57:42.284169 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4z7hm_de2e93af-0942-4efc-a33f-c9685c554154/kube-rbac-proxy/0.log" Dec 07 19:57:42 crc kubenswrapper[4815]: I1207 19:57:42.342365 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-4z7hm_de2e93af-0942-4efc-a33f-c9685c554154/nmstate-metrics/0.log" Dec 07 19:57:42 crc kubenswrapper[4815]: I1207 19:57:42.534076 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-zg96h_49c1d011-c4fa-4a22-b8e6-102fcc05362f/nmstate-operator/0.log" Dec 07 19:57:42 crc kubenswrapper[4815]: I1207 19:57:42.583968 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8khj4_a75fb831-cdc3-4a5b-b4ae-451acdaef347/nmstate-webhook/0.log" Dec 07 19:57:43 crc kubenswrapper[4815]: I1207 19:57:43.770029 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:57:43 crc kubenswrapper[4815]: E1207 19:57:43.770384 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:57:55 crc kubenswrapper[4815]: I1207 19:57:55.775473 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:57:55 crc kubenswrapper[4815]: E1207 19:57:55.776252 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.275174 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-29cdv_6454b83c-e2d6-413c-abbf-21a8f749750c/kube-rbac-proxy/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.334560 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-29cdv_6454b83c-e2d6-413c-abbf-21a8f749750c/controller/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.513074 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7zz67_b91cf016-e00e-4b0c-a1ef-2b851971e00b/frr-k8s-webhook-server/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.597453 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-frr-files/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.764050 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-frr-files/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.806381 4815 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-metrics/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.828961 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-reloader/0.log" Dec 07 19:57:57 crc kubenswrapper[4815]: I1207 19:57:57.851592 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-reloader/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.019779 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-reloader/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.099571 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-frr-files/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.113797 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-metrics/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.178407 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-metrics/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.374574 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-frr-files/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.393541 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-reloader/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.405742 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/controller/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.440854 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/cp-metrics/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.601102 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/frr-metrics/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.605978 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/kube-rbac-proxy-frr/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.683101 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/kube-rbac-proxy/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.844283 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/reloader/0.log" Dec 07 19:57:58 crc kubenswrapper[4815]: I1207 19:57:58.953461 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-b799b5946-s42vm_2855ae17-4194-4def-b747-15972b85f28f/manager/0.log" Dec 07 19:57:59 crc kubenswrapper[4815]: I1207 19:57:59.316846 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-858ccc4d87-28ck9_a2b44f0b-e1f1-472c-816e-acfca6f08db5/webhook-server/0.log" Dec 07 19:57:59 crc kubenswrapper[4815]: I1207 19:57:59.541753 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xdlfl_cd570d1e-fecb-4987-b842-d3e40a89a7a7/kube-rbac-proxy/0.log" Dec 07 19:57:59 crc kubenswrapper[4815]: I1207 19:57:59.568498 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wp7l7_f91a6d74-3342-4de5-92c4-354251161c5d/frr/0.log" Dec 07 19:57:59 crc kubenswrapper[4815]: I1207 19:57:59.901316 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-xdlfl_cd570d1e-fecb-4987-b842-d3e40a89a7a7/speaker/0.log" Dec 07 19:58:08 crc kubenswrapper[4815]: I1207 19:58:08.771119 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 19:58:09 crc kubenswrapper[4815]: I1207 19:58:09.625032 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1"} Dec 07 19:58:12 crc kubenswrapper[4815]: I1207 19:58:12.541094 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/util/0.log" Dec 07 19:58:12 crc kubenswrapper[4815]: I1207 19:58:12.788930 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/util/0.log" Dec 07 19:58:12 crc kubenswrapper[4815]: I1207 19:58:12.792341 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/pull/0.log" Dec 07 19:58:12 crc kubenswrapper[4815]: I1207 19:58:12.797036 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/pull/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.005583 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/util/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.019124 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/pull/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.074120 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftgl7c_6fa0357a-e808-457c-b9dc-728eca137551/extract/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.212413 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/util/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.367467 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/pull/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.371345 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/util/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.385791 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/pull/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.522230 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/util/0.log" Dec 07 
19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.574990 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/extract/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.583365 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83xcc8m_2599f100-fc55-4ced-96e8-1a6cc7de37f0/pull/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.744516 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-utilities/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.881834 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-content/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.929778 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-utilities/0.log" Dec 07 19:58:13 crc kubenswrapper[4815]: I1207 19:58:13.931848 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-content/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.165450 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-content/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.174233 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/extract-utilities/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.429473 
4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-utilities/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.485491 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qz8lw_3c190de3-1f0d-4455-ad0d-551ead201424/registry-server/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.673184 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-content/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.695185 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-utilities/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.717237 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-content/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.884099 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-content/0.log" Dec 07 19:58:14 crc kubenswrapper[4815]: I1207 19:58:14.915581 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/extract-utilities/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.256337 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-x887s_c32bb3fd-40b2-4f28-9fad-9283162b80c1/marketplace-operator/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.262872 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-9xq4b_bbe87d56-0208-4acc-aa62-97ba74eb4163/registry-server/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.336352 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-utilities/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.522019 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-content/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.574812 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-utilities/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.588821 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-content/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.731159 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-utilities/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.731326 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/extract-content/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.906089 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4qt4c_20784fa7-7f38-4dde-a229-37dc1db2e351/registry-server/0.log" Dec 07 19:58:15 crc kubenswrapper[4815]: I1207 19:58:15.929500 4815 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-utilities/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.098053 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-content/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.136949 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-content/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.153552 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-utilities/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.338451 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-content/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.359881 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/extract-utilities/0.log" Dec 07 19:58:16 crc kubenswrapper[4815]: I1207 19:58:16.976469 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rgtd5_ca317190-c788-4b00-9702-237d043cb5ed/registry-server/0.log" Dec 07 19:58:54 crc kubenswrapper[4815]: E1207 19:58:54.591550 4815 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.2:33488->38.102.83.2:39939: write tcp 38.102.83.2:33488->38.102.83.2:39939: write: broken pipe Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.701475 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:27 crc kubenswrapper[4815]: E1207 
19:59:27.702468 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04798220-974a-40e6-abfb-cc2df600088c" containerName="container-00" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.702487 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="04798220-974a-40e6-abfb-cc2df600088c" containerName="container-00" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.702749 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="04798220-974a-40e6-abfb-cc2df600088c" containerName="container-00" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.704326 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.719445 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.734523 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.734567 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wwx\" (UniqueName: \"kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.734645 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.835901 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.836039 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.836070 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wwx\" (UniqueName: \"kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.837840 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.838877 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:27 crc kubenswrapper[4815]: I1207 19:59:27.873202 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wwx\" (UniqueName: \"kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx\") pod \"redhat-operators-kz4n9\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:28 crc kubenswrapper[4815]: I1207 19:59:28.060526 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:28 crc kubenswrapper[4815]: I1207 19:59:28.537570 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:29 crc kubenswrapper[4815]: I1207 19:59:29.375397 4815 generic.go:334] "Generic (PLEG): container finished" podID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerID="fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d" exitCode=0 Dec 07 19:59:29 crc kubenswrapper[4815]: I1207 19:59:29.375468 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerDied","Data":"fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d"} Dec 07 19:59:29 crc kubenswrapper[4815]: I1207 19:59:29.375625 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerStarted","Data":"efeff8a3d645648f67bc920bec094b9c92092a8cffbf773d884f135aee5d8ade"} Dec 07 19:59:30 crc kubenswrapper[4815]: I1207 19:59:30.385300 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerStarted","Data":"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84"} Dec 07 19:59:34 crc kubenswrapper[4815]: I1207 19:59:34.450055 4815 generic.go:334] "Generic (PLEG): container finished" podID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerID="044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84" exitCode=0 Dec 07 19:59:34 crc kubenswrapper[4815]: I1207 19:59:34.450122 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerDied","Data":"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84"} Dec 07 19:59:36 crc kubenswrapper[4815]: I1207 19:59:36.468619 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerStarted","Data":"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326"} Dec 07 19:59:36 crc kubenswrapper[4815]: I1207 19:59:36.491649 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kz4n9" podStartSLOduration=3.170070285 podStartE2EDuration="9.49162895s" podCreationTimestamp="2025-12-07 19:59:27 +0000 UTC" firstStartedPulling="2025-12-07 19:59:29.377197522 +0000 UTC m=+2673.956187577" lastFinishedPulling="2025-12-07 19:59:35.698756157 +0000 UTC m=+2680.277746242" observedRunningTime="2025-12-07 19:59:36.488705717 +0000 UTC m=+2681.067695782" watchObservedRunningTime="2025-12-07 19:59:36.49162895 +0000 UTC m=+2681.070619005" Dec 07 19:59:38 crc kubenswrapper[4815]: I1207 19:59:38.061010 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:38 crc kubenswrapper[4815]: I1207 19:59:38.061082 4815 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:39 crc kubenswrapper[4815]: I1207 19:59:39.115410 4815 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kz4n9" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="registry-server" probeResult="failure" output=< Dec 07 19:59:39 crc kubenswrapper[4815]: timeout: failed to connect service ":50051" within 1s Dec 07 19:59:39 crc kubenswrapper[4815]: > Dec 07 19:59:48 crc kubenswrapper[4815]: I1207 19:59:48.105153 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:48 crc kubenswrapper[4815]: I1207 19:59:48.162886 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:48 crc kubenswrapper[4815]: I1207 19:59:48.340482 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:49 crc kubenswrapper[4815]: I1207 19:59:49.583068 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kz4n9" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="registry-server" containerID="cri-o://371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326" gracePeriod=2 Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.029322 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.156161 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7wwx\" (UniqueName: \"kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx\") pod \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.156347 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content\") pod \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.156378 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities\") pod \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\" (UID: \"8b637b08-8573-4ee1-99ed-eddb8f37b47a\") " Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.157556 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities" (OuterVolumeSpecName: "utilities") pod "8b637b08-8573-4ee1-99ed-eddb8f37b47a" (UID: "8b637b08-8573-4ee1-99ed-eddb8f37b47a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.168565 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx" (OuterVolumeSpecName: "kube-api-access-r7wwx") pod "8b637b08-8573-4ee1-99ed-eddb8f37b47a" (UID: "8b637b08-8573-4ee1-99ed-eddb8f37b47a"). InnerVolumeSpecName "kube-api-access-r7wwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.259210 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.259270 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7wwx\" (UniqueName: \"kubernetes.io/projected/8b637b08-8573-4ee1-99ed-eddb8f37b47a-kube-api-access-r7wwx\") on node \"crc\" DevicePath \"\"" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.284200 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b637b08-8573-4ee1-99ed-eddb8f37b47a" (UID: "8b637b08-8573-4ee1-99ed-eddb8f37b47a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.360995 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b637b08-8573-4ee1-99ed-eddb8f37b47a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.607880 4815 generic.go:334] "Generic (PLEG): container finished" podID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerID="371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326" exitCode=0 Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.607969 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerDied","Data":"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326"} Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.608045 4815 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kz4n9" event={"ID":"8b637b08-8573-4ee1-99ed-eddb8f37b47a","Type":"ContainerDied","Data":"efeff8a3d645648f67bc920bec094b9c92092a8cffbf773d884f135aee5d8ade"} Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.607995 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kz4n9" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.608122 4815 scope.go:117] "RemoveContainer" containerID="371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.655507 4815 scope.go:117] "RemoveContainer" containerID="044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.684876 4815 scope.go:117] "RemoveContainer" containerID="fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.694470 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.710682 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kz4n9"] Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.729719 4815 scope.go:117] "RemoveContainer" containerID="371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326" Dec 07 19:59:50 crc kubenswrapper[4815]: E1207 19:59:50.731184 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326\": container with ID starting with 371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326 not found: ID does not exist" containerID="371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.731214 4815 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326"} err="failed to get container status \"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326\": rpc error: code = NotFound desc = could not find container \"371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326\": container with ID starting with 371c625aee4164db68eda9bbf423eb9b105abb1b80e6eabcd1c703ca74ebb326 not found: ID does not exist" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.731237 4815 scope.go:117] "RemoveContainer" containerID="044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84" Dec 07 19:59:50 crc kubenswrapper[4815]: E1207 19:59:50.731640 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84\": container with ID starting with 044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84 not found: ID does not exist" containerID="044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.731774 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84"} err="failed to get container status \"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84\": rpc error: code = NotFound desc = could not find container \"044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84\": container with ID starting with 044a7b473cc5001bbbfd0d08c73f53f5d21548253a40c3e8f52943f462a1ea84 not found: ID does not exist" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.731859 4815 scope.go:117] "RemoveContainer" containerID="fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d" Dec 07 19:59:50 crc kubenswrapper[4815]: E1207 
19:59:50.732442 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d\": container with ID starting with fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d not found: ID does not exist" containerID="fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d" Dec 07 19:59:50 crc kubenswrapper[4815]: I1207 19:59:50.732532 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d"} err="failed to get container status \"fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d\": rpc error: code = NotFound desc = could not find container \"fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d\": container with ID starting with fea6bb274418d73e9a48e4d08c0352ca61aab0834f77bc7abaaeca99e046204d not found: ID does not exist" Dec 07 19:59:51 crc kubenswrapper[4815]: I1207 19:59:51.781606 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" path="/var/lib/kubelet/pods/8b637b08-8573-4ee1-99ed-eddb8f37b47a/volumes" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.149633 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh"] Dec 07 20:00:00 crc kubenswrapper[4815]: E1207 20:00:00.150643 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="extract-utilities" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.150686 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="extract-utilities" Dec 07 20:00:00 crc kubenswrapper[4815]: E1207 20:00:00.150709 4815 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="registry-server" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.150716 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="registry-server" Dec 07 20:00:00 crc kubenswrapper[4815]: E1207 20:00:00.150731 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="extract-content" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.150739 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="extract-content" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.151006 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b637b08-8573-4ee1-99ed-eddb8f37b47a" containerName="registry-server" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.151810 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.154388 4815 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.154568 4815 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.158631 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh"] Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.247153 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: 
\"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.247290 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.247317 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr6nw\" (UniqueName: \"kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.349150 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.349287 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.349319 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lr6nw\" (UniqueName: \"kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.351843 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.357824 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.372006 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr6nw\" (UniqueName: \"kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw\") pod \"collect-profiles-29418960-dgzvh\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.473906 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:00 crc kubenswrapper[4815]: I1207 20:00:00.959907 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh"] Dec 07 20:00:01 crc kubenswrapper[4815]: I1207 20:00:01.710617 4815 generic.go:334] "Generic (PLEG): container finished" podID="ecc53464-102b-4869-b13e-0caedaca6c05" containerID="e7b865bc0f84da5e82c5d9e19798c152ed6f58d5d26dea0482937534a039cb58" exitCode=0 Dec 07 20:00:01 crc kubenswrapper[4815]: I1207 20:00:01.710672 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" event={"ID":"ecc53464-102b-4869-b13e-0caedaca6c05","Type":"ContainerDied","Data":"e7b865bc0f84da5e82c5d9e19798c152ed6f58d5d26dea0482937534a039cb58"} Dec 07 20:00:01 crc kubenswrapper[4815]: I1207 20:00:01.710977 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" event={"ID":"ecc53464-102b-4869-b13e-0caedaca6c05","Type":"ContainerStarted","Data":"54aafc845d9b1fd22b7745ae38bcecefab812bab7b31f08698a5a6544f498299"} Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.006038 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.109766 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr6nw\" (UniqueName: \"kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw\") pod \"ecc53464-102b-4869-b13e-0caedaca6c05\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.109832 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume\") pod \"ecc53464-102b-4869-b13e-0caedaca6c05\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.109879 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume\") pod \"ecc53464-102b-4869-b13e-0caedaca6c05\" (UID: \"ecc53464-102b-4869-b13e-0caedaca6c05\") " Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.110787 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume" (OuterVolumeSpecName: "config-volume") pod "ecc53464-102b-4869-b13e-0caedaca6c05" (UID: "ecc53464-102b-4869-b13e-0caedaca6c05"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.128565 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ecc53464-102b-4869-b13e-0caedaca6c05" (UID: "ecc53464-102b-4869-b13e-0caedaca6c05"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.141342 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw" (OuterVolumeSpecName: "kube-api-access-lr6nw") pod "ecc53464-102b-4869-b13e-0caedaca6c05" (UID: "ecc53464-102b-4869-b13e-0caedaca6c05"). InnerVolumeSpecName "kube-api-access-lr6nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.252483 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr6nw\" (UniqueName: \"kubernetes.io/projected/ecc53464-102b-4869-b13e-0caedaca6c05-kube-api-access-lr6nw\") on node \"crc\" DevicePath \"\"" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.252510 4815 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecc53464-102b-4869-b13e-0caedaca6c05-config-volume\") on node \"crc\" DevicePath \"\"" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.252518 4815 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ecc53464-102b-4869-b13e-0caedaca6c05-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.727437 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" event={"ID":"ecc53464-102b-4869-b13e-0caedaca6c05","Type":"ContainerDied","Data":"54aafc845d9b1fd22b7745ae38bcecefab812bab7b31f08698a5a6544f498299"} Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.727494 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54aafc845d9b1fd22b7745ae38bcecefab812bab7b31f08698a5a6544f498299" Dec 07 20:00:03 crc kubenswrapper[4815]: I1207 20:00:03.727573 4815 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29418960-dgzvh" Dec 07 20:00:04 crc kubenswrapper[4815]: I1207 20:00:04.104253 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v"] Dec 07 20:00:04 crc kubenswrapper[4815]: I1207 20:00:04.110797 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29418915-rch2v"] Dec 07 20:00:05 crc kubenswrapper[4815]: I1207 20:00:05.804845 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1f9a05-fdb4-42cd-8835-44ab845941ad" path="/var/lib/kubelet/pods/0b1f9a05-fdb4-42cd-8835-44ab845941ad/volumes" Dec 07 20:00:20 crc kubenswrapper[4815]: I1207 20:00:20.885247 4815 generic.go:334] "Generic (PLEG): container finished" podID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerID="a6230ffa4b25b0286d26e2ea5b7549738bdade33db32732a6c327dce420331dc" exitCode=0 Dec 07 20:00:20 crc kubenswrapper[4815]: I1207 20:00:20.885336 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qb8tq/must-gather-nlthc" event={"ID":"a5376abd-47e3-4a79-a593-05cf789b4e16","Type":"ContainerDied","Data":"a6230ffa4b25b0286d26e2ea5b7549738bdade33db32732a6c327dce420331dc"} Dec 07 20:00:20 crc kubenswrapper[4815]: I1207 20:00:20.886260 4815 scope.go:117] "RemoveContainer" containerID="a6230ffa4b25b0286d26e2ea5b7549738bdade33db32732a6c327dce420331dc" Dec 07 20:00:21 crc kubenswrapper[4815]: I1207 20:00:21.855270 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb8tq_must-gather-nlthc_a5376abd-47e3-4a79-a593-05cf789b4e16/gather/0.log" Dec 07 20:00:24 crc kubenswrapper[4815]: I1207 20:00:24.174575 4815 scope.go:117] "RemoveContainer" containerID="bb77becfe2fd380c377e5c61a4161e67d662c57941151ee64ec98d16711606a6" Dec 07 20:00:24 crc kubenswrapper[4815]: E1207 20:00:24.373747 4815 upgradeaware.go:441] Error 
proxying data from backend to client: writeto tcp 38.102.83.2:54230->38.102.83.2:39939: read tcp 38.102.83.2:54230->38.102.83.2:39939: read: connection reset by peer Dec 07 20:00:26 crc kubenswrapper[4815]: I1207 20:00:26.359555 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:00:26 crc kubenswrapper[4815]: I1207 20:00:26.360091 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:00:30 crc kubenswrapper[4815]: I1207 20:00:30.726298 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qb8tq/must-gather-nlthc"] Dec 07 20:00:30 crc kubenswrapper[4815]: I1207 20:00:30.726945 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qb8tq/must-gather-nlthc" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="copy" containerID="cri-o://5005ec3c631cef5a12a6161db09f8a74fef2d877c88d782af4f9c8bdebddfac9" gracePeriod=2 Dec 07 20:00:30 crc kubenswrapper[4815]: I1207 20:00:30.746873 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qb8tq/must-gather-nlthc"] Dec 07 20:00:30 crc kubenswrapper[4815]: I1207 20:00:30.970276 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb8tq_must-gather-nlthc_a5376abd-47e3-4a79-a593-05cf789b4e16/copy/0.log" Dec 07 20:00:30 crc kubenswrapper[4815]: I1207 20:00:30.970704 4815 generic.go:334] "Generic (PLEG): container finished" 
podID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerID="5005ec3c631cef5a12a6161db09f8a74fef2d877c88d782af4f9c8bdebddfac9" exitCode=143 Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.348838 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb8tq_must-gather-nlthc_a5376abd-47e3-4a79-a593-05cf789b4e16/copy/0.log" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.349459 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.461620 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwx9r\" (UniqueName: \"kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r\") pod \"a5376abd-47e3-4a79-a593-05cf789b4e16\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.461730 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output\") pod \"a5376abd-47e3-4a79-a593-05cf789b4e16\" (UID: \"a5376abd-47e3-4a79-a593-05cf789b4e16\") " Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.483225 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r" (OuterVolumeSpecName: "kube-api-access-pwx9r") pod "a5376abd-47e3-4a79-a593-05cf789b4e16" (UID: "a5376abd-47e3-4a79-a593-05cf789b4e16"). InnerVolumeSpecName "kube-api-access-pwx9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.567637 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwx9r\" (UniqueName: \"kubernetes.io/projected/a5376abd-47e3-4a79-a593-05cf789b4e16-kube-api-access-pwx9r\") on node \"crc\" DevicePath \"\"" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.612730 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a5376abd-47e3-4a79-a593-05cf789b4e16" (UID: "a5376abd-47e3-4a79-a593-05cf789b4e16"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.669650 4815 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a5376abd-47e3-4a79-a593-05cf789b4e16-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.782604 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" path="/var/lib/kubelet/pods/a5376abd-47e3-4a79-a593-05cf789b4e16/volumes" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.982812 4815 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qb8tq_must-gather-nlthc_a5376abd-47e3-4a79-a593-05cf789b4e16/copy/0.log" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.986388 4815 scope.go:117] "RemoveContainer" containerID="5005ec3c631cef5a12a6161db09f8a74fef2d877c88d782af4f9c8bdebddfac9" Dec 07 20:00:31 crc kubenswrapper[4815]: I1207 20:00:31.986441 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qb8tq/must-gather-nlthc" Dec 07 20:00:32 crc kubenswrapper[4815]: I1207 20:00:32.014288 4815 scope.go:117] "RemoveContainer" containerID="a6230ffa4b25b0286d26e2ea5b7549738bdade33db32732a6c327dce420331dc" Dec 07 20:00:56 crc kubenswrapper[4815]: I1207 20:00:56.359794 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:00:56 crc kubenswrapper[4815]: I1207 20:00:56.360298 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.216379 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29418961-jbrs5"] Dec 07 20:01:00 crc kubenswrapper[4815]: E1207 20:01:00.217103 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="copy" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217117 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="copy" Dec 07 20:01:00 crc kubenswrapper[4815]: E1207 20:01:00.217135 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc53464-102b-4869-b13e-0caedaca6c05" containerName="collect-profiles" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217142 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc53464-102b-4869-b13e-0caedaca6c05" containerName="collect-profiles" Dec 07 20:01:00 crc kubenswrapper[4815]: E1207 
20:01:00.217162 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="gather" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217169 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="gather" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217399 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="gather" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217413 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc53464-102b-4869-b13e-0caedaca6c05" containerName="collect-profiles" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.217430 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5376abd-47e3-4a79-a593-05cf789b4e16" containerName="copy" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.218122 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.234910 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29418961-jbrs5"] Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.383088 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.383403 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.383484 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.383600 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zph4g\" (UniqueName: \"kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.486047 4815 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.486112 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.486229 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zph4g\" (UniqueName: \"kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.486298 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.495242 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.495501 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.500989 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.508805 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zph4g\" (UniqueName: \"kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g\") pod \"keystone-cron-29418961-jbrs5\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:00 crc kubenswrapper[4815]: I1207 20:01:00.596054 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:01 crc kubenswrapper[4815]: I1207 20:01:01.106789 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29418961-jbrs5"] Dec 07 20:01:01 crc kubenswrapper[4815]: I1207 20:01:01.223220 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29418961-jbrs5" event={"ID":"4768e92e-8223-4847-b49c-38593d2ebf9b","Type":"ContainerStarted","Data":"e8cef62a38a6fe2b712d1871e880474bfbd628fd3a8905f4fe92a2f2a831bab1"} Dec 07 20:01:02 crc kubenswrapper[4815]: I1207 20:01:02.233326 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29418961-jbrs5" event={"ID":"4768e92e-8223-4847-b49c-38593d2ebf9b","Type":"ContainerStarted","Data":"5ed970f74b59bcc81949e379c0ee9e9540436aed43a68e6fb6d47123cdd5eb1a"} Dec 07 20:01:02 crc kubenswrapper[4815]: I1207 20:01:02.257470 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29418961-jbrs5" podStartSLOduration=2.257454644 podStartE2EDuration="2.257454644s" podCreationTimestamp="2025-12-07 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-07 20:01:02.250827256 +0000 UTC m=+2766.829817301" watchObservedRunningTime="2025-12-07 20:01:02.257454644 +0000 UTC m=+2766.836444689" Dec 07 20:01:04 crc kubenswrapper[4815]: I1207 20:01:04.252027 4815 generic.go:334] "Generic (PLEG): container finished" podID="4768e92e-8223-4847-b49c-38593d2ebf9b" containerID="5ed970f74b59bcc81949e379c0ee9e9540436aed43a68e6fb6d47123cdd5eb1a" exitCode=0 Dec 07 20:01:04 crc kubenswrapper[4815]: I1207 20:01:04.252112 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29418961-jbrs5" 
event={"ID":"4768e92e-8223-4847-b49c-38593d2ebf9b","Type":"ContainerDied","Data":"5ed970f74b59bcc81949e379c0ee9e9540436aed43a68e6fb6d47123cdd5eb1a"} Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.613988 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.687246 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data\") pod \"4768e92e-8223-4847-b49c-38593d2ebf9b\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.687389 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys\") pod \"4768e92e-8223-4847-b49c-38593d2ebf9b\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.687604 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zph4g\" (UniqueName: \"kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g\") pod \"4768e92e-8223-4847-b49c-38593d2ebf9b\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.687697 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle\") pod \"4768e92e-8223-4847-b49c-38593d2ebf9b\" (UID: \"4768e92e-8223-4847-b49c-38593d2ebf9b\") " Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.696135 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "4768e92e-8223-4847-b49c-38593d2ebf9b" (UID: "4768e92e-8223-4847-b49c-38593d2ebf9b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.696885 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g" (OuterVolumeSpecName: "kube-api-access-zph4g") pod "4768e92e-8223-4847-b49c-38593d2ebf9b" (UID: "4768e92e-8223-4847-b49c-38593d2ebf9b"). InnerVolumeSpecName "kube-api-access-zph4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.716465 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4768e92e-8223-4847-b49c-38593d2ebf9b" (UID: "4768e92e-8223-4847-b49c-38593d2ebf9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.748635 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data" (OuterVolumeSpecName: "config-data") pod "4768e92e-8223-4847-b49c-38593d2ebf9b" (UID: "4768e92e-8223-4847-b49c-38593d2ebf9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.791571 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zph4g\" (UniqueName: \"kubernetes.io/projected/4768e92e-8223-4847-b49c-38593d2ebf9b-kube-api-access-zph4g\") on node \"crc\" DevicePath \"\"" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.791618 4815 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.791637 4815 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-config-data\") on node \"crc\" DevicePath \"\"" Dec 07 20:01:05 crc kubenswrapper[4815]: I1207 20:01:05.791648 4815 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4768e92e-8223-4847-b49c-38593d2ebf9b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 07 20:01:06 crc kubenswrapper[4815]: I1207 20:01:06.270466 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29418961-jbrs5" event={"ID":"4768e92e-8223-4847-b49c-38593d2ebf9b","Type":"ContainerDied","Data":"e8cef62a38a6fe2b712d1871e880474bfbd628fd3a8905f4fe92a2f2a831bab1"} Dec 07 20:01:06 crc kubenswrapper[4815]: I1207 20:01:06.270510 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8cef62a38a6fe2b712d1871e880474bfbd628fd3a8905f4fe92a2f2a831bab1" Dec 07 20:01:06 crc kubenswrapper[4815]: I1207 20:01:06.270615 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29418961-jbrs5" Dec 07 20:01:24 crc kubenswrapper[4815]: I1207 20:01:24.270726 4815 scope.go:117] "RemoveContainer" containerID="3ab313a71c7383bd71e5bbd5132f8a4f4ffb5309d70922e87b979c5d16a94146" Dec 07 20:01:26 crc kubenswrapper[4815]: I1207 20:01:26.360110 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:01:26 crc kubenswrapper[4815]: I1207 20:01:26.360696 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:01:26 crc kubenswrapper[4815]: I1207 20:01:26.360746 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 20:01:26 crc kubenswrapper[4815]: I1207 20:01:26.361581 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 20:01:26 crc kubenswrapper[4815]: I1207 20:01:26.361642 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" 
containerID="cri-o://cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1" gracePeriod=600 Dec 07 20:01:27 crc kubenswrapper[4815]: I1207 20:01:27.468963 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1" exitCode=0 Dec 07 20:01:27 crc kubenswrapper[4815]: I1207 20:01:27.469066 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1"} Dec 07 20:01:27 crc kubenswrapper[4815]: I1207 20:01:27.469748 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerStarted","Data":"149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c"} Dec 07 20:01:27 crc kubenswrapper[4815]: I1207 20:01:27.469786 4815 scope.go:117] "RemoveContainer" containerID="340b584b058ac547a7c5e921d79e470d96d688318eb315a993317b994733d8b7" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.472160 4815 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:27 crc kubenswrapper[4815]: E1207 20:02:27.473455 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4768e92e-8223-4847-b49c-38593d2ebf9b" containerName="keystone-cron" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.473483 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="4768e92e-8223-4847-b49c-38593d2ebf9b" containerName="keystone-cron" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.473886 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="4768e92e-8223-4847-b49c-38593d2ebf9b" containerName="keystone-cron" Dec 07 20:02:27 crc 
kubenswrapper[4815]: I1207 20:02:27.476346 4815 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.494851 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.561836 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkrgb\" (UniqueName: \"kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.561901 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.562007 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.663416 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkrgb\" (UniqueName: \"kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 
20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.663479 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.663522 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.664043 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.664095 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.689622 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkrgb\" (UniqueName: \"kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb\") pod \"redhat-marketplace-tn6gn\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:27 crc kubenswrapper[4815]: I1207 20:02:27.799346 4815 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:28 crc kubenswrapper[4815]: I1207 20:02:28.300591 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:29 crc kubenswrapper[4815]: I1207 20:02:29.095674 4815 generic.go:334] "Generic (PLEG): container finished" podID="27245652-b1aa-4341-96a0-c4addd1c2804" containerID="ff49e00d1e05c4ddf531f66aa49f103c183bcb205facd198143e91fbfc2ba88c" exitCode=0 Dec 07 20:02:29 crc kubenswrapper[4815]: I1207 20:02:29.096046 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerDied","Data":"ff49e00d1e05c4ddf531f66aa49f103c183bcb205facd198143e91fbfc2ba88c"} Dec 07 20:02:29 crc kubenswrapper[4815]: I1207 20:02:29.096086 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerStarted","Data":"0494bbd4843434447138d965ddba7ede8254166cf06423bc56f0ddfbdd4a9864"} Dec 07 20:02:29 crc kubenswrapper[4815]: I1207 20:02:29.099257 4815 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 07 20:02:30 crc kubenswrapper[4815]: I1207 20:02:30.105663 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerStarted","Data":"efdd1002a2b7326d17e602b1a56cb8f09b52c5dd577a1eeb6f3294b969ade123"} Dec 07 20:02:31 crc kubenswrapper[4815]: I1207 20:02:31.121498 4815 generic.go:334] "Generic (PLEG): container finished" podID="27245652-b1aa-4341-96a0-c4addd1c2804" containerID="efdd1002a2b7326d17e602b1a56cb8f09b52c5dd577a1eeb6f3294b969ade123" exitCode=0 Dec 07 20:02:31 crc kubenswrapper[4815]: I1207 
20:02:31.121578 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerDied","Data":"efdd1002a2b7326d17e602b1a56cb8f09b52c5dd577a1eeb6f3294b969ade123"} Dec 07 20:02:32 crc kubenswrapper[4815]: I1207 20:02:32.132983 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerStarted","Data":"087d34fc3e8b7ba71cec2000e3d3b41a0cff36608ac614935027d6f8159d4888"} Dec 07 20:02:32 crc kubenswrapper[4815]: I1207 20:02:32.151861 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tn6gn" podStartSLOduration=2.706293148 podStartE2EDuration="5.151843463s" podCreationTimestamp="2025-12-07 20:02:27 +0000 UTC" firstStartedPulling="2025-12-07 20:02:29.098782503 +0000 UTC m=+2853.677772588" lastFinishedPulling="2025-12-07 20:02:31.544332858 +0000 UTC m=+2856.123322903" observedRunningTime="2025-12-07 20:02:32.150865455 +0000 UTC m=+2856.729855520" watchObservedRunningTime="2025-12-07 20:02:32.151843463 +0000 UTC m=+2856.730833518" Dec 07 20:02:37 crc kubenswrapper[4815]: I1207 20:02:37.881570 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:37 crc kubenswrapper[4815]: I1207 20:02:37.882377 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:37 crc kubenswrapper[4815]: I1207 20:02:37.904227 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:38 crc kubenswrapper[4815]: I1207 20:02:38.243798 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 
20:02:38 crc kubenswrapper[4815]: I1207 20:02:38.297046 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:40 crc kubenswrapper[4815]: I1207 20:02:40.204105 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tn6gn" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="registry-server" containerID="cri-o://087d34fc3e8b7ba71cec2000e3d3b41a0cff36608ac614935027d6f8159d4888" gracePeriod=2 Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.213315 4815 generic.go:334] "Generic (PLEG): container finished" podID="27245652-b1aa-4341-96a0-c4addd1c2804" containerID="087d34fc3e8b7ba71cec2000e3d3b41a0cff36608ac614935027d6f8159d4888" exitCode=0 Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.214150 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerDied","Data":"087d34fc3e8b7ba71cec2000e3d3b41a0cff36608ac614935027d6f8159d4888"} Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.214179 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tn6gn" event={"ID":"27245652-b1aa-4341-96a0-c4addd1c2804","Type":"ContainerDied","Data":"0494bbd4843434447138d965ddba7ede8254166cf06423bc56f0ddfbdd4a9864"} Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.214190 4815 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0494bbd4843434447138d965ddba7ede8254166cf06423bc56f0ddfbdd4a9864" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.238837 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.404607 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities\") pod \"27245652-b1aa-4341-96a0-c4addd1c2804\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.405101 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content\") pod \"27245652-b1aa-4341-96a0-c4addd1c2804\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.405237 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkrgb\" (UniqueName: \"kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb\") pod \"27245652-b1aa-4341-96a0-c4addd1c2804\" (UID: \"27245652-b1aa-4341-96a0-c4addd1c2804\") " Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.405858 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities" (OuterVolumeSpecName: "utilities") pod "27245652-b1aa-4341-96a0-c4addd1c2804" (UID: "27245652-b1aa-4341-96a0-c4addd1c2804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.411760 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb" (OuterVolumeSpecName: "kube-api-access-bkrgb") pod "27245652-b1aa-4341-96a0-c4addd1c2804" (UID: "27245652-b1aa-4341-96a0-c4addd1c2804"). InnerVolumeSpecName "kube-api-access-bkrgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.428405 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27245652-b1aa-4341-96a0-c4addd1c2804" (UID: "27245652-b1aa-4341-96a0-c4addd1c2804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.507126 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.507163 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkrgb\" (UniqueName: \"kubernetes.io/projected/27245652-b1aa-4341-96a0-c4addd1c2804-kube-api-access-bkrgb\") on node \"crc\" DevicePath \"\"" Dec 07 20:02:41 crc kubenswrapper[4815]: I1207 20:02:41.507176 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27245652-b1aa-4341-96a0-c4addd1c2804-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 20:02:42 crc kubenswrapper[4815]: I1207 20:02:42.226712 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tn6gn" Dec 07 20:02:42 crc kubenswrapper[4815]: I1207 20:02:42.253890 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:42 crc kubenswrapper[4815]: I1207 20:02:42.262750 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tn6gn"] Dec 07 20:02:43 crc kubenswrapper[4815]: I1207 20:02:43.781656 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" path="/var/lib/kubelet/pods/27245652-b1aa-4341-96a0-c4addd1c2804/volumes" Dec 07 20:03:26 crc kubenswrapper[4815]: I1207 20:03:26.359779 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:03:26 crc kubenswrapper[4815]: I1207 20:03:26.360449 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:03:56 crc kubenswrapper[4815]: I1207 20:03:56.359961 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:03:56 crc kubenswrapper[4815]: I1207 20:03:56.360483 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" 
podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:04:26 crc kubenswrapper[4815]: I1207 20:04:26.360239 4815 patch_prober.go:28] interesting pod/machine-config-daemon-gkn4h container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 07 20:04:26 crc kubenswrapper[4815]: I1207 20:04:26.360672 4815 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 07 20:04:26 crc kubenswrapper[4815]: I1207 20:04:26.360722 4815 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" Dec 07 20:04:26 crc kubenswrapper[4815]: I1207 20:04:26.361441 4815 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c"} pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 07 20:04:26 crc kubenswrapper[4815]: I1207 20:04:26.361483 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerName="machine-config-daemon" containerID="cri-o://149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c" gracePeriod=600 Dec 07 
20:04:26 crc kubenswrapper[4815]: E1207 20:04:26.495186 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 20:04:27 crc kubenswrapper[4815]: I1207 20:04:27.216562 4815 generic.go:334] "Generic (PLEG): container finished" podID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" containerID="149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c" exitCode=0 Dec 07 20:04:27 crc kubenswrapper[4815]: I1207 20:04:27.216614 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" event={"ID":"3d662ba2-aa03-4eea-bd30-8ad40638f6c7","Type":"ContainerDied","Data":"149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c"} Dec 07 20:04:27 crc kubenswrapper[4815]: I1207 20:04:27.216660 4815 scope.go:117] "RemoveContainer" containerID="cfd4ba17f36b5ec097ece042ae7edbfb01b68a6555e6ad1f367486be28f1eea1" Dec 07 20:04:27 crc kubenswrapper[4815]: I1207 20:04:27.217505 4815 scope.go:117] "RemoveContainer" containerID="149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c" Dec 07 20:04:27 crc kubenswrapper[4815]: E1207 20:04:27.218250 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.858237 4815 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:29 crc kubenswrapper[4815]: E1207 20:04:29.859685 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="extract-utilities" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.859707 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="extract-utilities" Dec 07 20:04:29 crc kubenswrapper[4815]: E1207 20:04:29.859733 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="registry-server" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.859746 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="registry-server" Dec 07 20:04:29 crc kubenswrapper[4815]: E1207 20:04:29.859790 4815 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="extract-content" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.859804 4815 state_mem.go:107] "Deleted CPUSet assignment" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="extract-content" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.860189 4815 memory_manager.go:354] "RemoveStaleState removing state" podUID="27245652-b1aa-4341-96a0-c4addd1c2804" containerName="registry-server" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.862606 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:29 crc kubenswrapper[4815]: I1207 20:04:29.873023 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.000990 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkm7s\" (UniqueName: \"kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.001048 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.001196 4815 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.102950 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.103084 4815 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lkm7s\" (UniqueName: \"kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.103104 4815 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.103630 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.103677 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.137114 4815 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkm7s\" (UniqueName: \"kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s\") pod \"community-operators-zxfv6\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.213137 4815 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:30 crc kubenswrapper[4815]: I1207 20:04:30.535050 4815 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:31 crc kubenswrapper[4815]: I1207 20:04:31.253846 4815 generic.go:334] "Generic (PLEG): container finished" podID="1ecea754-5019-4610-b81d-e405a66f40f5" containerID="5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3" exitCode=0 Dec 07 20:04:31 crc kubenswrapper[4815]: I1207 20:04:31.253907 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerDied","Data":"5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3"} Dec 07 20:04:31 crc kubenswrapper[4815]: I1207 20:04:31.254242 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerStarted","Data":"a84c285273ab835d2e5d2e3816620b8a04d917c13465dc5a1cc08749a5a7762f"} Dec 07 20:04:32 crc kubenswrapper[4815]: I1207 20:04:32.264083 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerStarted","Data":"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52"} Dec 07 20:04:33 crc kubenswrapper[4815]: I1207 20:04:33.274077 4815 generic.go:334] "Generic (PLEG): container finished" podID="1ecea754-5019-4610-b81d-e405a66f40f5" containerID="dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52" exitCode=0 Dec 07 20:04:33 crc kubenswrapper[4815]: I1207 20:04:33.274177 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" 
event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerDied","Data":"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52"} Dec 07 20:04:34 crc kubenswrapper[4815]: I1207 20:04:34.286902 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerStarted","Data":"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323"} Dec 07 20:04:34 crc kubenswrapper[4815]: I1207 20:04:34.314403 4815 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zxfv6" podStartSLOduration=2.899503531 podStartE2EDuration="5.314383435s" podCreationTimestamp="2025-12-07 20:04:29 +0000 UTC" firstStartedPulling="2025-12-07 20:04:31.256047656 +0000 UTC m=+2975.835037701" lastFinishedPulling="2025-12-07 20:04:33.67092756 +0000 UTC m=+2978.249917605" observedRunningTime="2025-12-07 20:04:34.309319112 +0000 UTC m=+2978.888309157" watchObservedRunningTime="2025-12-07 20:04:34.314383435 +0000 UTC m=+2978.893373480" Dec 07 20:04:39 crc kubenswrapper[4815]: I1207 20:04:39.769573 4815 scope.go:117] "RemoveContainer" containerID="149247e582280a661fc3724e5798bed3a0dd823ca1dea0d7f214efa0bfd7873c" Dec 07 20:04:39 crc kubenswrapper[4815]: E1207 20:04:39.770379 4815 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-gkn4h_openshift-machine-config-operator(3d662ba2-aa03-4eea-bd30-8ad40638f6c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-gkn4h" podUID="3d662ba2-aa03-4eea-bd30-8ad40638f6c7" Dec 07 20:04:40 crc kubenswrapper[4815]: I1207 20:04:40.213612 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:40 crc 
kubenswrapper[4815]: I1207 20:04:40.213662 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:40 crc kubenswrapper[4815]: I1207 20:04:40.266055 4815 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:40 crc kubenswrapper[4815]: I1207 20:04:40.378359 4815 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:40 crc kubenswrapper[4815]: I1207 20:04:40.577668 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:42 crc kubenswrapper[4815]: I1207 20:04:42.352019 4815 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zxfv6" podUID="1ecea754-5019-4610-b81d-e405a66f40f5" containerName="registry-server" containerID="cri-o://bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323" gracePeriod=2 Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.275941 4815 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.340751 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities\") pod \"1ecea754-5019-4610-b81d-e405a66f40f5\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.341287 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkm7s\" (UniqueName: \"kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s\") pod \"1ecea754-5019-4610-b81d-e405a66f40f5\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.341324 4815 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content\") pod \"1ecea754-5019-4610-b81d-e405a66f40f5\" (UID: \"1ecea754-5019-4610-b81d-e405a66f40f5\") " Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.342805 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities" (OuterVolumeSpecName: "utilities") pod "1ecea754-5019-4610-b81d-e405a66f40f5" (UID: "1ecea754-5019-4610-b81d-e405a66f40f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.346678 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s" (OuterVolumeSpecName: "kube-api-access-lkm7s") pod "1ecea754-5019-4610-b81d-e405a66f40f5" (UID: "1ecea754-5019-4610-b81d-e405a66f40f5"). InnerVolumeSpecName "kube-api-access-lkm7s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.367099 4815 generic.go:334] "Generic (PLEG): container finished" podID="1ecea754-5019-4610-b81d-e405a66f40f5" containerID="bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323" exitCode=0 Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.367157 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerDied","Data":"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323"} Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.367190 4815 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxfv6" event={"ID":"1ecea754-5019-4610-b81d-e405a66f40f5","Type":"ContainerDied","Data":"a84c285273ab835d2e5d2e3816620b8a04d917c13465dc5a1cc08749a5a7762f"} Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.367213 4815 scope.go:117] "RemoveContainer" containerID="bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.367358 4815 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zxfv6" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.394841 4815 scope.go:117] "RemoveContainer" containerID="dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.404856 4815 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ecea754-5019-4610-b81d-e405a66f40f5" (UID: "1ecea754-5019-4610-b81d-e405a66f40f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.418888 4815 scope.go:117] "RemoveContainer" containerID="5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.443190 4815 scope.go:117] "RemoveContainer" containerID="bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.443458 4815 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.443491 4815 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkm7s\" (UniqueName: \"kubernetes.io/projected/1ecea754-5019-4610-b81d-e405a66f40f5-kube-api-access-lkm7s\") on node \"crc\" DevicePath \"\"" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.443506 4815 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ecea754-5019-4610-b81d-e405a66f40f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 07 20:04:43 crc kubenswrapper[4815]: E1207 20:04:43.443885 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323\": container with ID starting with bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323 not found: ID does not exist" containerID="bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.444039 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323"} err="failed to get container status 
\"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323\": rpc error: code = NotFound desc = could not find container \"bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323\": container with ID starting with bc01d514cea7a3c562deabdf2e2ba96bd9c6b2ea1827bad440423fdd134c1323 not found: ID does not exist" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.444115 4815 scope.go:117] "RemoveContainer" containerID="dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52" Dec 07 20:04:43 crc kubenswrapper[4815]: E1207 20:04:43.444481 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52\": container with ID starting with dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52 not found: ID does not exist" containerID="dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.444505 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52"} err="failed to get container status \"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52\": rpc error: code = NotFound desc = could not find container \"dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52\": container with ID starting with dd73894858dc54417551f9fd680814425bc7d498dc8bde83e9e8afd4df188f52 not found: ID does not exist" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.444523 4815 scope.go:117] "RemoveContainer" containerID="5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3" Dec 07 20:04:43 crc kubenswrapper[4815]: E1207 20:04:43.444847 4815 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3\": container with ID starting with 5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3 not found: ID does not exist" containerID="5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.444967 4815 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3"} err="failed to get container status \"5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3\": rpc error: code = NotFound desc = could not find container \"5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3\": container with ID starting with 5d3b9b01761a56ef70d1e5eeb222c699e364795cef3c7e3cb906a5459738fde3 not found: ID does not exist" Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.715951 4815 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.725363 4815 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zxfv6"] Dec 07 20:04:43 crc kubenswrapper[4815]: I1207 20:04:43.780731 4815 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ecea754-5019-4610-b81d-e405a66f40f5" path="/var/lib/kubelet/pods/1ecea754-5019-4610-b81d-e405a66f40f5/volumes"